Feeding data from multiple h5py files to TensorFlow




I have a practically unlimited amount of training and validation data, stored across multiple h5py files in a format that is ready for training. Each h5py file contains 'x' and 'y' datasets with 10,000 samples per file.

I feel quite lost about how to feed this data to a TensorFlow model efficiently. Could you point me in the right direction? What is the best practice for feeding data to a TensorFlow model during training? Do I need to build a TensorFlow dataset, a reader, queues? What is the right way to do this? I would highly appreciate a code example showing how to do it.
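For reference, here is one common approach: wrap the h5py files in a Python generator and build a `tf.data.Dataset` with `tf.data.Dataset.from_generator`. This is a minimal sketch, not a definitive answer; the sample shapes in `output_signature` and the helper names (`h5_sample_generator`, `make_dataset`) are assumptions and need to be adapted to the actual data.

```python
import h5py
import tensorflow as tf


def h5_sample_generator(file_paths):
    """Yield (x, y) pairs one sample at a time, file by file."""
    for path in file_paths:
        with h5py.File(path, "r") as f:
            x, y = f["x"], f["y"]
            for i in range(x.shape[0]):
                # h5py reads each sample lazily from disk here
                yield x[i], y[i]


def make_dataset(file_paths, batch_size=32):
    # The shapes/dtypes below are placeholders -- adjust to your data.
    ds = tf.data.Dataset.from_generator(
        lambda: h5_sample_generator(file_paths),
        output_signature=(
            tf.TensorSpec(shape=(None,), dtype=tf.float32),  # one 'x' sample
            tf.TensorSpec(shape=(None,), dtype=tf.float32),  # one 'y' sample
        ),
    )
    # Shuffle buffer sized to one file (10,000 samples per file),
    # then batch and prefetch so the GPU never waits on I/O.
    return ds.shuffle(10_000).batch(batch_size).prefetch(tf.data.AUTOTUNE)
```

You would then pass the result straight to `model.fit(make_dataset(paths))`. If reading a single file at a time becomes an I/O bottleneck, `tf.data.Dataset.from_tensor_slices(file_paths).interleave(...)` can read several files in parallel, at the cost of a slightly more involved setup.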

Thanks for sharing your knowledge!




