Learning deep learning : (Record no. 21844)

MARC details
000 -LEADER
fixed length control field 02112nam a2200229Ia 4500
003 - CONTROL NUMBER IDENTIFIER
control field NULRC
005 - DATE AND TIME OF LATEST TRANSACTION
control field 20250520103030.0
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION
fixed length control field 250520s9999 xx 000 0 und d
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
International Standard Book Number 9780137470358
040 ## - CATALOGING SOURCE
Transcribing agency NULRC
050 ## - LIBRARY OF CONGRESS CALL NUMBER
Classification number Q 325.5 .E36 2022
100 ## - MAIN ENTRY--PERSONAL NAME
Personal name Ekman, Magnus
Relator term author
245 #0 - TITLE STATEMENT
Title Learning deep learning :
Remainder of title theory and practice of neural networks, computer vision, natural language processing, and transformers using TensorFlow /
Statement of responsibility, etc. Magnus Ekman
260 ## - PUBLICATION, DISTRIBUTION, ETC.
Place of publication, distribution, etc. Boston, Massachusetts :
Name of publisher, distributor, etc. Addison-Wesley Publishing Company,
Date of publication, distribution, etc. c2022
300 ## - PHYSICAL DESCRIPTION
Extent liii, 688 pages :
Other physical details color illustrations ;
Dimensions 24 cm
365 ## - TRADE PRICE
Price amount USD 56.00
504 ## - BIBLIOGRAPHY, ETC. NOTE
Bibliography, etc. note Includes index.
505 ## - FORMATTED CONTENTS NOTE
Formatted contents note 1. The Rosenblatt Perceptron -- 2. Gradient-Based Learning -- 3. Sigmoid Neurons and Backpropagation -- 4. Fully Connected Networks Applied to Multiclass Classification -- 5. Toward DL: Frameworks and Network Tweaks -- 6. Fully Connected Networks Applied to Regression -- 7. Convolutional Neural Networks Applied to Image Classification -- 8. Deeper CNNs and Pretrained Models -- 9. Predicting Time Sequences with Recurrent Neural Networks -- 10. Long Short-Term Memory -- 11. Text Autocompletion with LSTM and Beam Search -- 12. Neural Language Models and Word Embeddings -- 13. Word Embeddings from word2vec and GloVe -- 14. Sequence-to-Sequence Networks and Natural Language Translation -- 15. Attention and the Transformer -- 16. One-to-Many Network for Image Captioning -- 17. Medley of Additional Topics -- 18. Summary and Next Steps.
520 ## - SUMMARY, ETC.
Summary, etc. Learning Deep Learning is a complete guide to DL, illuminating both the core concepts and the hands-on programming techniques needed to succeed. The text is suitable for students with prior programming experience but no prior machine learning or statistics experience. After introducing the essential building blocks of deep neural networks, such as artificial neurons and fully connected, convolutional, and recurrent layers, Ekman shows how to use them to build advanced architectures, including the Transformer.
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element MACHINE LEARNING
942 ## - ADDED ENTRY ELEMENTS (KOHA)
Source of classification or shelving scheme Library of Congress Classification
Koha item type Books
Holdings
Source of classification or shelving scheme Library of Congress Classification
Collection Machine Learning
Home library LRC - Main
Current library National University - Manila
Shelving location General Circulation
Date acquired 05/14/2024
Source of acquisition Purchased - Amazon
Cost, normal purchase price 56.00
Full call number GC Q 325.5 .E36 2022
Barcode NULIB000019603
Date last seen 05/20/2025
Copy number c.1
Price effective from 05/20/2025
Koha item type Books