000 02112nam a2200229Ia 4500
003 NULRC
005 20250520103030.0
008 250520s2022 xx 000 0 eng d
020 _a9780137470358
040 _cNULRC
050 _aQ 325.5 .E36 2022
100 _aEkman, Magnus
_eauthor
245 0 _aLearning deep learning :
_btheory and practice of neural networks, computer vision, natural language processing, and transformers using TensorFlow /
_cMagnus Ekman
260 _aBoston, Massachusetts :
_bAddison-Wesley Publishing Company,
_cc2022
300 _aliii, 688 pages :
_bcolor illustrations ;
_c24 cm
365 _bUSD56
504 _aIncludes index.
520 _aLearning Deep Learning is a complete guide to DL. Illuminating both the core concepts and the hands-on programming techniques needed to succeed, this text is suitable for students with prior programming experience but no prior machine learning or statistics experience. After introducing the essential building blocks of deep neural networks, such as artificial neurons and fully connected, convolutional, and recurrent layers, Ekman shows how to use them to build advanced architectures, including the Transformer.
505 _a1. The Rosenblatt Perceptron -- 2. Gradient-Based Learning -- 3. Sigmoid Neurons and Backpropagation -- 4. Fully Connected Networks Applied to Multiclass Classification -- 5. Towards DL: Frameworks and Network Tweaks -- 6. Fully Connected Networks Applied to Regression -- 7. Convolutional Neural Networks Applied to Image Classification -- 8. Deeper CNNs and Pretrained Models -- 9. Predicting Time Sequences with Recurrent Neural Networks -- 10. Long Short-Term Memory -- 11. Text Autocompletion with LSTM and Beam Search -- 12. Neural Language Models and Word Embeddings -- 13. Word Embeddings from word2vec and GloVe -- 14. Sequence-to-Sequence Networks and Natural Language Translation -- 15. Attention and the Transformer -- 16. One-to-Many Network for Image Captioning -- 17. Medley of Additional Topics -- 18. Summary and Next Steps.
650 _aMACHINE LEARNING
942 _2lcc
_cBK
999 _c21844
_d21844