
TensorFlow 2.0 in Action: Using TensorFlow (and Bert?) In Production

 
Ranch Hand
Posts: 35
Hi Thushan.

What this world really needs (well, after universal peace, health, wealth and happiness) is an example of using TensorFlow (and Bert too) in production on a big project. This would include:

+ Using the tf.data API to stream data from disk during training, so that it does not all have to be held in memory.
+ Using TensorFlow Serving, or another robust method, to create a webservice (capable of servicing concurrent requests of course) that can take input data and return predictions from a trained TensorFlow model. Perhaps the predictions would be the top 5 labels together with the % confidence for each label?
+ Saving the trained model and loading it again to test doing predictions.
+ For NLP, obviously the input data for the webservice would have to be transformed in the same way by the webservice as the training data was transformed. That bit of pre-processing code would have to be factored out into a separate function.
+ Deploying a trained model to TensorFlow Lite, so that it can run directly in an Android app. I have done this for a small CNN TensorFlow model. It was a bit fiddly to work out.

Is that too much to ask for in your book?


Thanks
Don.
 
Author
Posts: 24
Hi Don,

Yes, I hear you. All the points you made are valid. However, a problem arises when writing a book for a general audience: I have to make trade-offs between the basic and advanced topics covered.

+ Using the tf.data API to stream data from disk during training, so that it does not all have to be held in memory.

 This will be covered; for example, Ch06 and Ch07 both talk about using data generators to retrieve data from disk in batches.
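As a quick illustration of the idea (the file names and one-example-per-line CSV layout here are just placeholders), a plain Python generator can feed tf.data so that only one batch is ever materialised at a time:

```python
import tensorflow as tf

def rows_from_disk(paths):
    # Lazily yield (text, label) pairs one line at a time,
    # so the full dataset never has to sit in memory.
    for path in paths:
        with open(path) as f:
            for line in f:
                text, label = line.rstrip("\n").rsplit(",", 1)
                yield text, int(label)

dataset = (
    tf.data.Dataset.from_generator(
        lambda: rows_from_disk(["train_part1.csv", "train_part2.csv"]),
        output_signature=(
            tf.TensorSpec(shape=(), dtype=tf.string),
            tf.TensorSpec(shape=(), dtype=tf.int32),
        ),
    )
    .shuffle(10_000)
    .batch(32)
    .prefetch(tf.data.AUTOTUNE)
)
# model.fit(dataset, epochs=3) would then stream batches from disk.
```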

+ Using TensorFlow Serving, or another robust method, to create a webservice (capable of servicing concurrent requests of course) that can take input data and return predictions from a trained TensorFlow model. Perhaps the predictions would be the top 5 labels together with the % confidence for each label?

 I'm not sure about this one. Yes, serving is an important topic, but I might exhaust the ToC before reaching it. If that's the case, I'll make sure to incorporate it into a second edition.
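For what it's worth, TensorFlow Serving's REST API is simple enough that a bare-bones client fits in a few lines. This is only a sketch (the model name "my_model", the port, and the label list are assumptions), with the top-5 extraction Don asked about done on the client side:

```python
import json
import urllib.request

import numpy as np

def top5_from_probs(probs, labels):
    # Pair each label with its probability and keep the five highest,
    # expressed as percentages.
    order = np.argsort(probs)[::-1][:5]
    return [(labels[i], float(probs[i]) * 100.0) for i in order]

def predict(instances, url="http://localhost:8501/v1/models/my_model:predict"):
    # TensorFlow Serving's REST API accepts {"instances": [...]} and
    # returns {"predictions": [...]}; model name and port are assumptions.
    body = json.dumps({"instances": instances}).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["predictions"]
```

A web framework sitting in front of this could then return top5_from_probs(predict([x])[0], label_list) per request; concurrency is handled by TF Serving itself.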

+ Saving the trained model and loading it again to test doing predictions.

 This will be covered throughout.
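The basic round trip looks like this (a toy model stands in for a trained one, and the HDF5 file name is arbitrary):

```python
import numpy as np
import tensorflow as tf

# Toy stand-in; any trained Keras model saves and loads the same way.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])

model.save("my_model.h5")                        # serialise to disk (HDF5 here)
restored = tf.keras.models.load_model("my_model.h5")

# The restored model reproduces the original's predictions.
x = np.random.rand(2, 4).astype("float32")
assert np.allclose(model(x).numpy(), restored(x).numpy())
```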

+ For NLP, obviously the input data for the webservice would have to be transformed in the same way by the webservice as the training data was transformed. That bit of pre-processing code would have to be factored out into a separate function.

 Defining NLP preprocessing pipelines will be covered. It wouldn't really matter where the data is coming from; all data goes through the same steps.
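In other words, the cleanup lives in one function that both the training code and the web service call. A minimal pure-Python example (the exact steps are up to the model, of course):

```python
import re

def preprocess(text: str) -> str:
    # Single source of truth for text cleanup, called by BOTH the
    # training pipeline and the serving endpoint, so inference-time
    # inputs match what the model saw during training.
    text = text.lower()
    text = re.sub(r"[^a-z0-9\s]", " ", text)   # drop punctuation
    return re.sub(r"\s+", " ", text).strip()   # collapse whitespace
```

At serving time the endpoint simply calls preprocess(request_text) before tokenisation, exactly as the training pipeline does.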

+ Deploying a trained model to TensorFlow Lite, so that it can run directly in an Android app. I have done this for a small CNN TensorFlow model. It was a bit fiddly to work out.

 Again, this topic probably will not be covered (at least in this edition).
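For anyone who wants to try it in the meantime: the conversion itself is only a few lines (a toy model stands in for a trained one); the fiddly part Don mentions is mostly on the Android/Interpreter side.

```python
import tensorflow as tf

# Toy stand-in; convert any trained Keras model the same way.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()          # FlatBuffer ready for mobile

with open("model.tflite", "wb") as f:
    f.write(tflite_bytes)
# On Android, model.tflite is loaded with the TensorFlow Lite
# Interpreter API and fed input tensors directly.
```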


Again, thank you for the great suggestions.

 