Deep Learning for Physical Scientists: Accelerating Research with Machine Learning

International Edition



About the Book

Discover the power of machine learning in the physical sciences with this one-stop resource from a leading voice in the field. Deep Learning for Physical Scientists: Accelerating Research with Machine Learning delivers an insightful analysis of the transformative techniques being used in deep learning within the physical sciences. The book gives readers the ability to understand, select, and apply the best deep learning techniques for their individual research problems and to interpret the outcomes. Designed to teach researchers to think in useful new ways about how to achieve results, the book provides scientists with new avenues to attack problems and to avoid common pitfalls. Practical case studies and problems give readers an opportunity to put what they have learned into practice, with exemplar coding approaches provided to assist the reader. From modelling basics to feed-forward networks, the book offers a broad cross-section of machine learning techniques to improve physical science research.

Readers will also enjoy:
  • A thorough introduction to basic classification and regression with perceptrons
  • An exploration of training algorithms, including back propagation, stochastic gradient descent, and the parallelisation of training
  • An examination of multi-layer perceptrons for learning from descriptors and de-noising data
  • Discussions of recurrent neural networks for learning from sequences and convolutional neural networks for learning from images
  • A treatment of Bayesian optimisation for tuning deep learning architectures

Perfect for academic and industrial research professionals in the physical sciences, Deep Learning for Physical Scientists: Accelerating Research with Machine Learning will also earn a place in the libraries of industrial researchers who have access to large amounts of data but have yet to learn the techniques to fully exploit that access.

Table of Contents:
About the Authors xi
Acknowledgements xii
1 Prefix – Learning to “Think Deep” 1
  1.1 So What Do I Mean by Changing the Way You Think? 2
2 Setting Up a Python Environment for Deep Learning Projects 5
  2.1 Python Overview 5
  2.2 Why Use Python for Data Science? 6
  2.3 Anaconda Python 7
    2.3.1 Why Use Anaconda? 7
    2.3.2 Downloading and Installing Anaconda Python 7
      2.3.2.1 Installing TensorFlow 9
  2.4 Jupyter Notebooks 10
    2.4.1 Why Use a Notebook? 10
    2.4.2 Starting a Jupyter Notebook Server 11
    2.4.3 Adding Markdown to Notebooks 12
    2.4.4 A Simple Plotting Example 14
    2.4.5 Summary 16
3 Modelling Basics 17
  3.1 Introduction 17
  3.2 Start Where You Mean to Go On – Input Definition and Creation 17
  3.3 Loss Functions 18
    3.3.1 Classification and Regression 19
    3.3.2 Regression Loss Functions 19
      3.3.2.1 Mean Absolute Error 19
      3.3.2.2 Root Mean Squared Error 19
    3.3.3 Classification Loss Functions 20
      3.3.3.1 Precision 21
      3.3.3.2 Recall 21
      3.3.3.3 F1 Score 22
      3.3.3.4 Confusion Matrix 22
      3.3.3.5 (Area Under) Receiver Operator Curve (AU-ROC) 23
      3.3.3.6 Cross Entropy 25
  3.4 Overfitting and Underfitting 28
    3.4.1 Bias–Variance Trade-Off 29
  3.5 Regularisation 31
    3.5.1 Ridge Regression 31
    3.5.2 LASSO Regularisation 33
    3.5.3 Elastic Net 34
    3.5.4 Bagging and Model Averaging 34
  3.6 Evaluating a Model 35
    3.6.1 Holdout Testing 35
    3.6.2 Cross Validation 36
  3.7 The Curse of Dimensionality 37
    3.7.1 Normalising Inputs and Targets 37
  3.8 Summary 39
  Notes 39
4 Feedforward Networks and Multilayered Perceptrons 41
  4.1 Introduction 41
  4.2 The Single Perceptron 41
    4.2.1 Training a Perceptron 41
    4.2.2 Activation Functions 42
    4.2.3 Back Propagation 43
      4.2.3.1 Weight Initialisation 45
      4.2.3.2 Learning Rate 46
    4.2.4 Key Assumptions 46
    4.2.5 Putting It All Together in TensorFlow 47
  4.3 Moving to a Deep Network 49
  4.4 Vanishing Gradients and Other “Deep” Problems 53
    4.4.1 Gradient Clipping 54
    4.4.2 Non-saturating Activation Functions 54
      4.4.2.1 ReLU 54
      4.4.2.2 Leaky ReLU 56
      4.4.2.3 ELU 57
    4.4.3 More Complex Initialisation Schemes 57
      4.4.3.1 Xavier 58
      4.4.3.2 He 58
    4.4.4 Mini Batching 59
  4.5 Improving the Optimisation 60
    4.5.1 Bias 60
    4.5.2 Momentum 63
    4.5.3 Nesterov Momentum 63
    4.5.4 (Adaptive) Learning Rates 63
    4.5.5 AdaGrad 64
    4.5.6 RMSProp 65
    4.5.7 Adam 65
    4.5.8 Regularisation 66
    4.5.9 Early Stopping 66
    4.5.10 Dropout 68
  4.6 Parallelisation of Learning 69
    4.6.1 Hogwild! 69
  4.7 High and Low-level TensorFlow APIs 70
  4.8 Architecture Implementations 72
  4.9 Summary 73
  4.10 Papers to Read 73
5 Recurrent Neural Networks 77
  5.1 Introduction 77
  5.2 Basic Recurrent Neural Networks 77
    5.2.1 Training a Basic RNN 78
    5.2.2 Putting It All Together in TensorFlow 79
    5.2.3 The Problem with Vanilla RNNs 81
  5.3 Long Short-Term Memory (LSTM) Networks 82
    5.3.1 Forget Gate 82
    5.3.2 Input Gate 84
    5.3.3 Output Gate 84
    5.3.4 Peephole Connections 85
    5.3.5 Putting It All Together in TensorFlow 86
  5.4 Gated Recurrent Units 87
    5.4.1 Putting It All Together in TensorFlow 88
  5.5 Using Keras for RNNs 88
  5.6 Real World Implementations 89
  5.7 Summary 89
  5.8 Papers to Read 90
6 Convolutional Neural Networks 93
  6.1 Introduction 93
  6.2 Fundamental Principles of Convolutional Neural Networks 94
    6.2.1 Convolution 94
    6.2.2 Pooling 95
      6.2.2.1 Why Use Pooling? 95
      6.2.2.2 Types of Pooling 96
    6.2.3 Stride and Padding 99
    6.2.4 Sparse Connectivity 101
    6.2.5 Parameter Sharing 101
    6.2.6 Convolutional Neural Networks with TensorFlow 102
  6.3 Graph Convolutional Networks 103
    6.3.1 Graph Convolutional Networks in Practice 104
  6.4 Real World Implementations 107
  6.5 Summary 108
  6.6 Papers to Read 108
7 Auto-Encoders 111
  7.1 Introduction 111
    7.1.1 Auto-Encoders for Dimensionality Reduction 111
  7.2 Getting a Good Start – Stacked Auto-Encoders, Restricted Boltzmann Machines, and Pretraining 115
    7.2.1 Restricted Boltzmann Machines 115
    7.2.2 Stacking Restricted Boltzmann Machines 118
  7.3 Denoising Auto-Encoders 120
  7.4 Variational Auto-Encoders 121
  7.5 Sequence to Sequence Learning 125
  7.6 The Attention Mechanism 126
  7.7 Application in Chemistry: Building a Molecular Generator 127
  7.8 Summary 132
  7.9 Real World Implementations 132
  7.10 Papers to Read 132
8 Optimising Models Using Bayesian Optimisation 135
  8.1 Introduction 135
  8.2 Defining Our Function 135
  8.3 Grid and Random Search 136
  8.4 Moving Towards an Intelligent Search 137
  8.5 Exploration and Exploitation 137
  8.6 Greedy Search 138
    8.6.1 Key Fact One – Exploitation Heavy Search is Susceptible to Initial Data Bias 139
  8.7 Diversity Search 141
  8.8 Bayesian Optimisation 142
    8.8.1 Domain Knowledge (or Prior) 142
    8.8.2 Gaussian Processes 145
    8.8.3 Kernels 146
      8.8.3.1 Stationary Kernels 146
      8.8.3.2 Noise Kernel 147
    8.8.4 Combining Gaussian Process Prediction and Optimisation 149
      8.8.4.1 Probability of Improvement 149
      8.8.4.2 Expected Improvement 150
    8.8.5 Balancing Exploration and Exploitation 151
    8.8.6 Upper and Lower Confidence Bound Algorithm 151
    8.8.7 Maximum Entropy Sampling 152
    8.8.8 Optimising the Acquisition Function 153
    8.8.9 Cost Sensitive Bayesian Optimisation 155
    8.8.10 Constrained Bayesian Optimisation 158
    8.8.11 Parallel Bayesian Optimisation 158
      8.8.11.1 qEI 158
      8.8.11.2 Constant Liar and Kriging Believer 160
      8.8.11.3 Local Penalisation 162
      8.8.11.4 Parallel Thompson Sampling 162
      8.8.11.5 K-Means Batch Bayesian Optimisation 162
  8.9 Summary 163
  8.10 Papers to Read 163
Case Study 1 Solubility Prediction Case Study 167
  CS 1.1 Step 1 – Import Packages 167
  CS 1.2 Step 2 – Importing the Data 168
  CS 1.3 Step 3 – Creating the Inputs 168
  CS 1.4 Step 4 – Splitting into Training and Testing 168
  CS 1.5 Step 5 – Defining Our Model 169
  CS 1.6 Step 6 – Running Our Model 169
  CS 1.7 Step 7 – Automatically Finding an Optimised Architecture Using Bayesian Optimisation 170
Case Study 2 Time Series Forecasting with LSTMs 173
  CS 2.1 Simple LSTM 173
  CS 2.2 Sequence-to-Sequence LSTM 177
Case Study 3 Deep Embeddings for Auto-Encoder-Based Featurisation 185
Index 190
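The case-study listing above outlines a complete feed-forward workflow: import packages, prepare inputs, split into training and testing, define a model, then train and evaluate it. As a rough illustration of what those steps look like in code, the sketch below builds a small regression model with TensorFlow's Keras API and scikit-learn's train_test_split. It is a minimal sketch only: the synthetic data, feature count, and network architecture are hypothetical placeholders, not the book's own solubility example.

```python
# Illustrative sketch of a feed-forward regression workflow in the spirit of the
# case-study steps listed above. All data and architecture choices are placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
import tensorflow as tf

# Synthetic stand-in for a table of descriptors (X) and a target property (y).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20)).astype("float32")
y = (X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=500)).astype("float32")

# Hold out a test set, as in the "Splitting into Training and Testing" step.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# A small multilayer perceptron for regression (layer sizes chosen arbitrarily).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(X.shape[1],)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Train, then report mean squared error and mean absolute error on the held-out set.
model.fit(X_train, y_train, epochs=20, batch_size=32, validation_split=0.1, verbose=0)
print(model.evaluate(X_test, y_test, verbose=0))
```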




Product Details
  • ISBN-13: 9781119408338
  • ISBN-10: 1119408334
  • Sub Title: Accelerating Research with Machine Learning
  • Publisher: John Wiley & Sons Inc
  • Publisher Imprint: John Wiley & Sons Inc
  • Publisher Date: 21 Oct 2021
  • Binding: Hardback
  • Language: English
  • No of Pages: 208
  • Height: 244 mm
  • Width: 170 mm
  • Spine Width: 18 mm
  • Weight: 503 g
  • Returnable: N


