Feeding data from multiple h5py files to TensorFlow
I have a practically unlimited amount of training and validation data, stored across multiple h5py files in a format ready for training. Every h5py file consists of 'x' and 'y' datasets containing on the order of 10,000 samples per file.
I'm feeling quite lost on how to feed this data to a TensorFlow model efficiently. Could someone point me in the right direction? What is the best practice for feeding a TensorFlow model data during training? Do I need to build a TensorFlow dataset, a reader, or queues? How should this be done the right way? I would highly appreciate a code example showing how to do it.
Thanks for sharing your knowledge!
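One common approach for the layout described above is a Python generator wrapped in `tf.data.Dataset.from_generator`, which streams samples from one h5py file at a time instead of loading everything into memory. The sketch below is a minimal illustration under that assumption; the file names, sample shapes, and the `make_demo_files` helper are made up for the example, so adapt the `TensorSpec` shapes and dtypes to your real 'x' and 'y' datasets.

```python
import h5py
import numpy as np
import tensorflow as tf

def make_demo_files(paths, n=100, dim=8):
    """Create small demo files in the assumed layout (stand-in for real data)."""
    for p in paths:
        with h5py.File(p, "w") as f:
            f.create_dataset("x", data=np.random.rand(n, dim).astype("float32"))
            f.create_dataset("y", data=np.random.randint(0, 2, size=(n,)).astype("int64"))

def sample_generator(paths):
    """Yield (x, y) pairs file by file, so only one h5py file is open at a time."""
    for p in paths:
        with h5py.File(p, "r") as f:
            xs, ys = f["x"], f["y"]
            for i in range(len(xs)):
                yield xs[i], ys[i]

def build_dataset(paths, dim=8, batch_size=32):
    """Wrap the generator in a tf.data pipeline with shuffle/batch/prefetch."""
    ds = tf.data.Dataset.from_generator(
        lambda: sample_generator(paths),
        output_signature=(
            tf.TensorSpec(shape=(dim,), dtype=tf.float32),
            tf.TensorSpec(shape=(), dtype=tf.int64),
        ),
    )
    # Shuffle within a buffer (approximate shuffling for streamed data),
    # then batch and prefetch so input loading overlaps with training.
    return ds.shuffle(1000).batch(batch_size).prefetch(tf.data.AUTOTUNE)
```

A dataset built this way can be passed straight to `model.fit(ds, ...)`. For stronger shuffling across files, one option is to shuffle the file list each epoch before handing it to the generator.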