<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Machine Learning for Computational Engineering</title>
<meta name="description" content="Swan is an open-source topology optimization toolbox capable of performing structural and material design.">
<link rel="stylesheet" href="styles.css">
<link href='https://fonts.googleapis.com/css?family=Poppins:400,600,700' rel='stylesheet'>
<link rel="shortcut icon" type="image/png" href="website/images/Favicon.png">
</head>
<body>
<div class="swanhead">
<div class="top-navbar">
<a class="logo">
<!-- <img src="website/images/swanlogo_black.png" height="45px"> -->
</a>
<a href="https://github.com/SwanLab/MachineLearningForComputationalEngineering" class="ghlogo">
<img src="images/GitHub-Mark-32px.png" height="28px">
</a>
<a href="#experience" class="top-navbar-link">Experience</a>
<a href="#reviews" class="top-navbar-link">Reviews</a>
<a href="#practice" class="top-navbar-link">Practice</a>
<a href="#theory" class="top-navbar-link">Theory</a>
<a href="#thecourse" class="top-navbar-link">The course</a>
<a href="#teachers" class="top-navbar-link">Teachers</a>
</div>
<div class="swanhead-content">
<div class="swanhead-features">
<div class="swanhead-features-LHS">
<div class="swanhead-text">
<div>
<div class="headline">
Machine Learning for Computational Engineering
</div>
<div class="headline-sub">
Join us in this free course, where travellers will get a first glance at the basic aspects of machine learning. An introduction to optimization and statistics will fill the traveller's backpack for the machine learning journey ahead. To ease the integration of the concepts, they will be presented in the language of Computational Engineering. The practical sessions of the course will let travellers become familiar with the basic machine learning culture.
</div>
</div>
</div>
</div>
<div class="swanhead-features-RHS">
<!-- It's underwhelming to say the least -->
<img src="Practice/optgif.gif" height="350px"/>
</div>
<div class="swanhead-features-RHS">
</div>
</div>
</div>
</div>
<div class="swancontent">
<!-- SWAN: CONTRIBUTORS -->
<div class="section-start">
<a id="teachers"></a>
<div class="section-header">Meet the teachers</div>
</div>
<div class="section-contributors">
<div class="section-contributors-contributor" >
<div class="contributor-image"> <img src="teachers/alexferrer.png"/></div>
<div style="padding: 0 3em">
<span class="contributor-name">Àlex Ferrer</span><br/>
<span class="contributor-text">
I'm an associate researcher at CIMNE and a Serra Húnter lecturer in the Physics Department at Universitat Politècnica de Catalunya (UPC) in Barcelona. I obtained my PhD in Computational Mechanics at UPC with a thesis on multi-scale topology optimization. I recently finished a three-year postdoc at École Polytechnique in Paris, funded by a Marie Curie Individual Fellowship awarded in 2019, and I am currently spending a research year at the Mathematical Institute, University of Oxford.
I currently teach Flight Mechanics and Machine Learning at Universitat Politècnica de Catalunya.
<br>
Contact: <span style="color:blue">alex.ferrer@upc.edu </span>
</span>
</div>
</div>
<div class="section-contributors-contributor" >
<div class="contributor-image"> <img src="teachers/toni.png"/></div>
<div style="padding: 0 3em">
<span class="contributor-name">Toni Darder</span><br/>
<span class="contributor-text">
I am Antonio Darder, an aerospace engineer passionate about space, aviation and numerical methods. I developed my undergraduate final project on deep learning and computer vision techniques. I am currently studying a Master's in Aerospace Engineering at ISAE-Supaero while researching machine learning techniques for model predictive control.
<br/>Contact: <span style="color:blue">tonidarder@gmail.com</span>
</span>
</div>
</div>
</div> <!-- end of contributors -->
<!-- SWAN: FEATURES -->
<div class="section-start">
<a id="thecourse"></a>
<div class="section-header">The course</div>
</div>
<div class="timetable">
<table class="timetable-table">
<tbody>
<tr>
<th>Day 1</th>
<th>Day 2</th>
<th>Day 3</th>
<th>Day 4</th>
</tr>
<tr>
<td><a href="#theory1" class="timetable-lesson">Basics in continuous optimization (Part I)</a><br>Àlex Ferrer</td>
<td><a href="#theory3" class="timetable-lesson">Basics in continuous optimization (Part II)</a><br>Àlex Ferrer</td>
<td><a href="#theory5" class="timetable-lesson">Optimization in Computational Engineering</a><br>Àlex Ferrer</td>
<td><a href="#theory7" class="timetable-lesson">Statistical Learning</a><br>Àlex Ferrer</td>
</tr>
<tr>
<td><a href="#theory2" class="timetable-lesson">Applications in Machine Learning</a><br>Àlex Ferrer</td>
<td><a href="#theory4" class="timetable-lesson">Supervised Learning (Part I)</a><br>Àlex Ferrer</td>
<td><a href="#theory6" class="timetable-lesson">Supervised Learning (Part II)</a><br>Àlex Ferrer</td>
<td><a href="#theory8" class="timetable-lesson">Unsupervised Learning</a><br>Àlex Ferrer</td>
</tr>
<tr>
<td><a href="#practice1" class="timetable-lesson">Practical Session I</a><br>Antonio Darder</td>
<td><a href="#practice2" class="timetable-lesson">Practical Session II</a><br>Antonio Darder</td>
<td><a href="#practice3" class="timetable-lesson">Practical Session III</a><br>Antonio Darder</td>
<td><a href="#practice4" class="timetable-lesson">Practical Session IV</a><br>Antonio Darder</td>
</tr>
</tbody></table>
</div>
<!-- SWAN: EXAMPLES -->
<div class="section-start">
<a id="theory"></a>
<div class="section-header">Theory lessons</div>
</div>
<div class="section-examples">
<div class ="section-newExample">
<a id="theory1"></a>
<div class="section-newExample-imgHorizontal">
<iframe width="100%" height="100%" src="https://www.youtube.com/embed/ssrCbiaKuJU" title="Basics in continuous optimization Part I" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
</div>
<div class="section-newExample-content">
<div class="section-newExample-tags">
<span class="tag-example">Lecture 1</span>
</div>
<div class="examples-name">Basics in continuous optimization (Part I)</div>
<div class="examples-text">
An introduction to continuous optimization will be presented, followed by basic examples on least squares and linear (and sequential quadratic) programming. Then, unconstrained optimization will be discussed together with gradient-based algorithms. The module will end by presenting line-search methods (and learning rates) together with some illustrative examples.
<br/><a href="https://drive.google.com/file/d/1L0KJDkWPi2UMBKwauWvRo1yIIuRDYYNK/view">Lecture1.pdf</a>
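As a small taste of these ideas, here is a minimal Python sketch of steepest descent with an Armijo backtracking line search; the quadratic objective and all parameter values are hypothetical illustrations, not course material:

```python
# Steepest descent with Armijo backtracking (a simple line-search rule).
def backtracking_gradient_descent(f, grad, x0, alpha0=1.0, beta=0.5, c=1e-4,
                                  tol=1e-8, max_iter=1000):
    """Minimise f from x0; shrink the step until sufficient decrease holds."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) < tol:   # stop when the gradient is small
            break
        alpha = alpha0
        # Backtrack until the Armijo sufficient-decrease condition holds.
        while f([xi - alpha * gi for xi, gi in zip(x, g)]) > \
                f(x) - c * alpha * sum(gi * gi for gi in g):
            alpha *= beta
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
    return x

# Hypothetical example: minimise f(x, y) = (x - 1)^2 + 10 y^2, minimiser (1, 0).
f = lambda v: (v[0] - 1.0) ** 2 + 10.0 * v[1] ** 2
grad = lambda v: [2.0 * (v[0] - 1.0), 20.0 * v[1]]
x_star = backtracking_gradient_descent(f, grad, [5.0, 3.0])
```

The backtracking loop is what the lecture calls a line-search method: it adapts the learning rate at every iteration instead of fixing it in advance.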
</div>
</div>
</div>
<div class ="section-newExample">
<a id="theory2"></a>
<div class="section-newExample-imgHorizontal">
<iframe width="100%" height="100%" src="https://www.youtube.com/embed/ssrCbiaKuJU" title="Basics in continuous optimization Part I" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
</div>
<div class="section-newExample-content">
<div class="section-newExample-tags">
<span class="tag-example">Lecture 2</span>
</div>
<div class="examples-name">Applications in Machine Learning</div>
<div class="examples-text">
This module will serve as a first taste of machine learning. We will present the similarities and differences between machine learning and classical optimization. First examples on regression will allow us to internalise the optimization concepts explained in the first sessions. Mean square errors, sparse approximation, lasso problems, ridge regression and L1-norm optimization will be among the examples.
<br/><a href="https://drive.google.com/file/d/1oBWZCdeCrBcYBB9Zg0q8gHewpahJSgeG/view">Lecture2.pdf</a>
</div>
</div>
</div>
<div class ="section-newExample">
<a id="theory3"></a>
<div class="section-newExample-imgHorizontal">
<iframe width="100%" height="100%" src="https://www.youtube.com/embed/GNkfw74mxs0" title="Basics in continuous optimization Part II" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
</div>
<div class="section-newExample-content">
<div class="section-newExample-tags">
<span class="tag-example">Lecture 3</span>
</div>
<div class="examples-name">Basics in continuous optimization (Part II)</div>
<div class="examples-text">
This second excursion into continuous optimization will focus on constrained optimization problems, where duality plays a central role. The Legendre-Fenchel transform, Lagrange multipliers, penalty methods, the KKT conditions and other concepts in constrained optimization will be described. Useful algorithms such as SQP, the augmented Lagrangian, trust-region and interior-point methods will be presented. The theory will be accompanied by several illustrative examples.
<br/><a href="https://drive.google.com/file/d/1KlP0PSK3v0_SsNN69vqiuifxOAhzgoNV/view">Lecture3.pdf</a>
</div>
</div>
</div>
<div class ="section-newExample">
<a id="theory4"></a>
<div class="section-newExample-imgHorizontal">
<iframe width="100%" height="100%" src="https://www.youtube.com/embed/ScazFBZ9v48" title=" Supervised Learning Part I" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
</div>
<div class="section-newExample-content">
<div class="section-newExample-tags">
<span class="tag-example">Lecture 4</span>
</div>
<div class="examples-name">Supervised Learning (Part I)</div>
<div class="examples-text">
We will start with polynomial regression to understand the capacity of machine learning models. We will analyse the underfitting-overfitting trade-off and the importance of regularisation and of the hyperparameters. In this module, we will also present logistic regression and the main classification models, including multi-class classification. We will end by introducing the support vector machine problem.
<br/><a href="https://drive.google.com/file/d/1zJ1IxcEdhgceatM9tDYsFWrpFDqXe3q/view">Lecture4.pdf</a>
</div>
</div>
</div>
<div class ="section-newExample">
<a id="theory5"></a>
<div class="section-newExample-imgHorizontal">
<iframe width="100%" height="100%" src="https://www.youtube.com/embed/WLxJLF0PLCk" title=" Optimization in Computational Engineering " frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
</div>
<div class="section-newExample-content">
<div class="section-newExample-tags">
<span class="tag-example">Lecture 5</span>
</div>
<div class="examples-name">Optimization in Computational Engineering</div>
<div class="examples-text">
We will start with polynomial regression to understand the capacity of machine learning models. We will analyse the underfitting-overfitting trade-off and the importance of regularisation and of the hyperparameters. In this module, we will also present logistic regression and the main classification models, including multi-class classification. We will end by introducing the support vector machine problem.
<br/><a href="https://drive.google.com/file/d/11Mos0r-fjKfgP5c1fdWulLokaApPvNGp/view">Lecture5.pdf</a>
</div>
</div>
</div>
<div class ="section-newExample">
<a id="theory6"></a>
<div class="section-newExample-imgHorizontal">
<iframe width="100%" height="100%" src="https://www.youtube.com/embed/DsSGEmHjrD0" title="Supervised Learning II" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
</div>
<div class="section-newExample-content">
<div class="section-newExample-tags">
<span class="tag-example">Lecture 6</span>
</div>
<div class="examples-name">Supervised Learning (Part II)</div>
<div class="examples-text">
We will start with polynomial regression to understand the capacity of machine learning models. We will analyse the underfitting-overfitting trade-off and the importance of regularisation and of the hyperparameters. In this module, we will also present logistic regression and the main classification models, including multi-class classification. We will end by introducing the support vector machine problem.
<br/><a href="https://drive.google.com/file/d/1UlmTaZziwN4Sh5P5gBxklwNljSljXHw0/view">Lecture6.pdf</a>
</div>
</div>
</div>
<div class ="section-newExample">
<a id="theory7"></a>
<div class="section-newExample-imgHorizontal">
<iframe width="100%" height="100%" src="https://www.youtube.com/embed/Kt-72Tdfsiw" title="Statistical Learning" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
</div>
<div class="section-newExample-content">
<div class="section-newExample-tags">
<span class="tag-example">Lecture 7</span>
</div>
<div class="examples-name">Statistical Learning</div>
<div class="examples-text">
Machine learning builds on some important concepts from statistics. In this module, the necessary statistical ingredients will be introduced gradually. We will move from reviewing basic concepts of probability and statistics to the study of Bayesian inference and some useful parameter estimators. The maximum likelihood estimator and maximum a posteriori estimation will allow us to give a statistical insight into regression and classification models.
<br/><a href="https://drive.google.com/file/d/1RG7jFC8OgCVvtyuE2ZmGxZglv9kTlJik/view">Lecture7.pdf</a>
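As a small illustration of the estimators mentioned above, the following Python sketch computes the maximum likelihood estimates for a Gaussian; the data set and its parameters are hypothetical:

```python
import numpy as np

# Hypothetical data: 10,000 i.i.d. samples from N(mu = 2, sigma = 0.5).
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=0.5, size=10_000)

# Maximising the Gaussian log-likelihood gives the sample mean for mu
# and the 1/n (biased) sample variance for sigma^2.
mu_mle = data.mean()
sigma2_mle = np.mean((data - mu_mle) ** 2)
```

With enough samples both estimates land close to the true parameters (2 and 0.25), which is the consistency property discussed in the lecture.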
</div>
</div>
</div>
<div class ="section-newExample">
<a id="theory8"></a>
<div class="section-newExample-imgHorizontal">
<iframe width="100%" height="100%" src="https://www.youtube.com/embed/PDAPeRW1rDg" title="Unsupervised Learning" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
</div>
<div class="section-newExample-content">
<div class="section-newExample-tags">
<span class="tag-example">Lecture 8</span>
</div>
<div class="examples-name">Unsupervised Learning</div>
<div class="examples-text">
In this module we will present the clustering problem to motivate the importance of unsupervised learning. We will start with the K-means algorithm and the expectation-maximization algorithm. Then, we will present the PCA algorithm for dimensionality reduction. We will end by presenting some more advanced unsupervised learning methods, such as independent component analysis (ICA) and non-negative matrix factorization.
<br/><a href="https://drive.google.com/file/d/1K0--58kMdVm26-qNdc6RQsccEqckZNRo/view">Lecture8.pdf</a>
</div>
</div>
</div>
</div>
<div class="section-start">
<a id="practice"></a>
<div class="section-header">Practical sessions</div>
</div>
<div class="section-examples">
<div class ="section-newExample">
<a id="practice1"></a>
<div class="section-newExample-imgHorizontal">
<img src="images/practice/session1.png">
</div>
<div class="section-newExample-content">
<div class="section-newExample-tags">
<span class="tag-example">Practice 1</span>
</div>
<div class="examples-name">Regressions</div>
<div class="examples-text">
We will start with linear and polynomial regression, internalising the importance of the chosen norm, the train/test split, regularisation and hyperparameter selection.
<br/><a href="https://github.com/SwanLab/MachineLearningForComputationalEngineering/blob/main/Practice/Sess1/1_LinearRegression(Solution).ipynb">Python notebook (.ipynb)</a>
<br/><a href="https://github.com/SwanLab/MachineLearningForComputationalEngineering/blob/main/Practice/Sess1/Session_1_LinearRegression.pdf">Guide (PDF)</a>
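The workflow of this session can be sketched in a few lines; the synthetic data, the polynomial degree and the split sizes below are hypothetical choices, not the session's actual notebook:

```python
import numpy as np

# Hypothetical data: a noisy sine curve.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2.0 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

# Random train/test split (30 training points, 10 test points).
idx = rng.permutation(x.size)
train, test = idx[:30], idx[30:]

# Least-squares (L2-norm) fit of a degree-5 polynomial on the training split.
coeffs = np.polyfit(x[train], y[train], deg=5)

# Mean squared error on each split; comparing the two reveals overfitting.
mse_train = np.mean((np.polyval(coeffs, x[train]) - y[train]) ** 2)
mse_test = np.mean((np.polyval(coeffs, x[test]) - y[test]) ** 2)
```

Raising the degree drives `mse_train` down while `mse_test` eventually grows, which is the trade-off the session explores.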
</div>
</div>
</div>
<div class ="section-newExample">
<a id="practice2"></a>
<div class="section-newExample-imgHorizontal">
<img src="images/practice/session2.png">
</div>
<div class="section-newExample-content">
<div class="section-newExample-tags">
<span class="tag-example">Practice 2</span>
</div>
<div class="examples-name">Overfitting and underfitting</div>
<div class="examples-text">
We will continue with the regression models, practising the underfitting and overfitting concepts. Then, we will move to classification problems by solving our first logistic regression problem.</div>
<a href="https://github.com/SwanLab/MachineLearningForComputationalEngineering/blob/main/Practice/Sess2/2_LogisticRegression(Solution).ipynb">Python notebook (.ipynb)</a>
<br/><a href="https://github.com/SwanLab/MachineLearningForComputationalEngineering/blob/main/Practice/Sess2/Session_2_LogisticRegression.pdf">Guide (PDF)</a>
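A minimal sketch of such a logistic regression fit, assuming synthetic 1-D data and plain gradient descent (not the session's actual notebook):

```python
import numpy as np

# Hypothetical 1-D data: class 0 around x = -2, class 1 around x = +2.
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(-2.0, 1.0, 100), rng.normal(2.0, 1.0, 100)])
y = np.concatenate([np.zeros(100), np.ones(100)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Gradient descent on the mean cross-entropy loss.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    p = sigmoid(w * X + b)
    w -= lr * np.mean((p - y) * X)   # d(loss)/dw
    b -= lr * np.mean(p - y)         # d(loss)/db

accuracy = np.mean((sigmoid(w * X + b) > 0.5) == y)
```

The gradient of the cross-entropy loss with a sigmoid model reduces to the simple `(p - y)` form used above, which is why logistic regression is a natural first classification exercise.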
</div>
</div>
<div class ="section-newExample">
<a id="practice3"></a>
<div class="section-newExample-imgHorizontal">
<img src="images/practice/session3.png">
</div>
<div class="section-newExample-content">
<div class="section-newExample-tags">
<span class="tag-example">Practice 3</span>
</div>
<div class="examples-name">Neural network</div>
<div class="examples-text">
We will consolidate the classification concepts by solving the support vector machine problem. We will then code our own neural network using backpropagation, automatic differentiation and stochastic gradient descent. We will practise several optimization strategies, such as the selection of the learning rate and of acceleration schemes.
<br/><a href="https://github.com/SwanLab/MachineLearningForComputationalEngineering/blob/main/Practice/Sess3/3_NN(Solution).ipynb">Python notebook (.ipynb)</a>
<br/><a href="https://github.com/SwanLab/MachineLearningForComputationalEngineering/blob/main/Practice/Sess3/Session_3_NeuralNetwork.pdf">Guide (PDF)</a>
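The kind of network coded in this session can be sketched as follows: one hidden layer trained on XOR with hand-written backpropagation. The layer sizes, learning rate and data are hypothetical choices, not the session's actual notebook:

```python
import numpy as np

# Hypothetical setup: the XOR problem, a 2-8-1 network, tanh hidden units,
# a sigmoid output and plain gradient descent.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 1.0

for _ in range(5000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: cross-entropy + sigmoid gives (p - y) at the output.
    dout = (p - y) / len(X)
    dW2 = h.T @ dout; db2 = dout.sum(axis=0)
    dh = (dout @ W2.T) * (1.0 - h ** 2)   # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dh; db1 = dh.sum(axis=0)
    # Gradient descent step.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

pred = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5).astype(float)
```

Each backward line is the chain rule applied to one layer, which is exactly what automatic differentiation frameworks do for us in larger models.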
</div>
</div>
</div>
<div class ="section-newExample">
<a id="practice4"></a>
<div class="section-newExample-imgHorizontal">
<img src="images/practice/session4.png">
</div>
<div class="section-newExample-content">
<div class="section-newExample-tags">
<span class="tag-example">Practice 4</span>
</div>
<div class="examples-name">Clustering and PCA</div>
<div class="examples-text">
In this last session, we will code the k-means algorithm for clustering. Then, we will use the PCA algorithm to reduce the dimension of the features. We will end by experiencing how PCA can be coupled with other machine learning algorithms.
<br/><a href="https://github.com/SwanLab/MachineLearningForComputationalEngineering/blob/main/Practice/Sess4/4_Kmeans_PCA(solution).ipynb">Python notebook (.ipynb)</a>
<br/><a href="https://github.com/SwanLab/MachineLearningForComputationalEngineering/blob/main/Practice/Sess4/Session_4_Kmeans_PCA.pdf">Guide (PDF)</a>
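A minimal sketch of the k-means (Lloyd's) iteration coded in this session, run on a hypothetical two-cluster data set; the PCA step is omitted:

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Plain Lloyd's algorithm: alternate assignment and centroid updates."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: each point goes to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centroid moves to the mean of its points
        # (an empty cluster keeps its previous centroid).
        centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                              else centroids[j] for j in range(k)])
    return labels, centroids

# Hypothetical data: two well-separated Gaussian blobs in 2-D.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
               rng.normal(3.0, 0.3, (50, 2))])
labels, centroids = kmeans(X, k=2)
```

The two alternating steps each decrease the within-cluster sum of squares, which is why the iteration converges.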
</div>
</div>
</div>
</div>
<!-- nice BSc/MSc thesis projects + add something about European projects? -->
<!-- SWAN: REVIEWS -->
<div class="section-start">
<a id="reviews"></a>
<div class="section-header">Reviews</div>
</div>
<div class="meet-the-teachers-carousel">
<div class="meet-the-teachers-carousel-image">
<img src="survey/veryinteresting.png" height="100%"/>
<img src="survey/useful.png" height="100%"/>
<img src="survey/ratelevel.png" height="100%"/>
<img src="survey/teachers.png" height="100%"/>
<img src="survey/likedtheoretical.png" height="100%"/>
<img src="survey/likedpractical.png" height="100%"/>
</div>
</div>
<div class="reviews">
<div class="reviews-quote">
"I am very satisfied with the Summer School, the theoretical part was very clear and the exercises were reasonable to do."
</div>
<div class="reviews-quote">
"I think the format is great!"
</div>
</div>
<div class="section-start">
<a id="experience"></a>
<div class="section-header">Experience</div>
</div>
<div class="meet-the-teachers-carousel">
<div class="meet-the-teachers-carousel-image">
<img src="images/fullhouse.jpeg" height="100%"/>
<img src="images/session1.jpeg" height="100%"/>
<img src="images/sessionPractice.jpeg" height="100%"/>
<img src="images/bonet.jpeg" height="100%"/>
<img src="images/sessionPractice2.jpeg" height="100%"/>
</div>
</div>
<div style="margin: 2em 0">
This course was prepared as part of the CIMNE Summer School in 2022. The event
was held on the North Campus of the Technical University of Catalonia (UPC) from July 4th
to July 8th, 2022. It was coordinated by Àlex Ferrer, with Ricardo Rossi, Sergio Zlotnik,
Jordi Pons-Prats, Lucia Barbu and Xavier Martinez as organizers.
</div>
</div>
</body>
</html>