<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Slides | CONECT | Computational Neuroscience Center @ INT</title><link>https://conect-int.github.io/slides/</link><atom:link href="https://conect-int.github.io/slides/index.xml" rel="self" type="application/rss+xml"/><description>Slides</description><generator>Hugo Blox Builder (https://hugoblox.com)</generator><language>en-us</language><lastBuildDate>Mon, 19 Jun 2023 14:00:00 +0000</lastBuildDate><item><title>Computational Neuroscience project</title><link>https://conect-int.github.io/slides/2023-06-19-conect-centuri-summer-school/</link><pubDate>Mon, 19 Jun 2023 14:00:00 +0000</pubDate><guid>https://conect-int.github.io/slides/2023-06-19-conect-centuri-summer-school/</guid><description>
&lt;section data-noprocess data-shortcode-slide
data-background-image="/media/open-book.jpg"
>
&lt;h2 id="neural-computation-through-population-dynamics">Neural computation through population dynamics&lt;/h2>
&lt;h5 id="computational-neuroscience-project">Computational Neuroscience project&lt;/h5>
&lt;h3 id="centuri-summer-school">CENTURI Summer school&lt;/h3>
&lt;p>&lt;a href="https://conect-int.github.io/talk/2023-06-20-conect-at-the-centuri-summer-school/" target="_blank" rel="noopener">https://conect-int.github.io/talk/2023-06-20-conect-at-the-centuri-summer-school/&lt;/a>&lt;/p>
&lt;aside class="notes">
&lt;p>&lt;strong>1 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>Press &lt;code>S&lt;/code> key to view&lt;/li>
&lt;li>This project is part of the CENTURI summer school, and we would like to thank the organizers of the school&amp;hellip;&lt;/li>
&lt;li>In this short presentation, we will outline the challenges we want to tackle, which we named&amp;hellip;&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h2 id="who-are-we">Who are we?&lt;/h2>
&lt;table>
&lt;tr>
&lt;th>&lt;img data-src="https://conect-int.github.io/authors/nicolas-meirhaeghe/avatar.jpg" height="200" />&lt;/th>
&lt;th>&lt;img data-src="https://conect-int.github.io/authors/laurent-u-perrinet/avatar.png" height="200" />&lt;/th>
&lt;/tr>
&lt;tr>
&lt;td>Nicolas&lt;BR>Meirhaeghe&lt;/td>
&lt;td>Laurent&lt;BR>Perrinet&lt;/td>
&lt;/tr>
&lt;/table>
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTE&lt;/strong>&lt;/p>
&lt;p>This project is supervised by NM and myself. We are both at the INT, working at the interface between neurophysiology and computational modelling.&lt;/p>
&lt;/aside>
&lt;hr>
&lt;h2 id="challenge-brain-decoding">Challenge: brain decoding&lt;/h2>
&lt;span class="fragment " >
&lt;img data-src="https://raw.githubusercontent.com/CONECT-INT/2023_CENTURI-SummerSchool/main/datasets/dataset1_reaching-task/decoding.png" height="420" />
&lt;/span>
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>our brains light up with the activity of billions of cells, mostly carried by action potentials, or &lt;em>spikes&lt;/em>,&lt;/li>
&lt;li>neural activity is structured in a way that allows agents to act on the world&lt;/li>
&lt;li>we wish to better understand this relationship by using machine learning.&lt;/li>
&lt;/ul>
&lt;p>In this example, a monkey performs a reaching task cued on a display while neural activity (raster plot) is recorded in the premotor area. Our goal is to design a computational method that predicts the actual behavior. Achieving this would help us better understand the computational principles of the brain.&lt;/p>
&lt;ul>
&lt;li>applications to brain-computer interfaces (BCI)&lt;/li>
&lt;/ul>
&lt;p>&amp;ldquo;what I can build, I can understand&amp;rdquo;
(or, more modestly, as Feynman said: &amp;ldquo;What I cannot create, I do not understand.&amp;rdquo;)&lt;/p>
&lt;/aside>
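To make this decoding goal concrete, here is a minimal sketch in Python using simulated data rather than the actual recordings (the 40 cosine-tuned neurons, 8 reach directions, and Poisson spike counts are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not the course dataset): 40 neurons, 8 reach
# directions, cosine-tuned Poisson spike counts in a fixed window.
n_neurons, n_dirs, n_trials = 40, 8, 50
dirs = np.linspace(0, 2 * np.pi, n_dirs, endpoint=False)
pref = rng.uniform(0, 2 * np.pi, n_neurons)            # preferred directions
rates = 5 + 4 * np.cos(dirs[:, None] - pref[None, :])  # mean count per trial

X, y = [], []
for d in range(n_dirs):
    X.append(rng.poisson(rates[d], size=(n_trials, n_neurons)))
    y.append(np.full(n_trials, d))
X, y = np.concatenate(X), np.concatenate(y)

# Split into train/test halves and decode with a nearest-class-mean rule.
idx = rng.permutation(len(y))
train, test = idx[: len(y) // 2], idx[len(y) // 2 :]
means = np.stack([X[train][y[train] == d].mean(axis=0) for d in range(n_dirs)])
pred = np.argmin(((X[test][:, None, :] - means[None]) ** 2).sum(-1), axis=1)
accuracy = (pred == y[test]).mean()
print(f"decoding accuracy: {accuracy:.2f} (chance = {1 / n_dirs:.2f})")
```

A nearest-class-mean decoder is only a stand-in; the project explores more powerful approaches on real recordings.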
&lt;hr>
&lt;h2 id="objectives">Objectives&lt;/h2>
&lt;ul>
&lt;li>Learn computational methods to interpret and interrogate neural data&lt;/li>
&lt;li>Learn to reduce the complexity of high-dimensional neural data&lt;/li>
&lt;li>Learn statistical approaches to perform hypothesis-testing on neural data&lt;/li>
&lt;li>Learn the principles of decoding analyses to relate neural data to behavioral data&lt;/li>
&lt;/ul>
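The dimensionality-reduction objective can be sketched with PCA via the SVD; the simulated population below is hypothetical, driven by only three shared latent signals so that the data are genuinely low-dimensional:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical population: 100 neurons driven by 3 latent signals
# plus a small amount of private noise.
n_neurons, n_time, n_latent = 100, 500, 3
latents = rng.standard_normal((n_time, n_latent))
mixing = rng.standard_normal((n_latent, n_neurons))
rates = latents @ mixing + 0.1 * rng.standard_normal((n_time, n_neurons))

# PCA by SVD of the mean-centered data matrix.
centered = rates - rates.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / (s**2).sum()          # variance fraction per component
top3 = explained[:3].sum()
projection = centered @ vt[:3].T         # low-dimensional trajectory (n_time, 3)
print(f"variance explained by 3 PCs: {top3:.3f}")
```

On real spiking data one would first bin and smooth the spike trains; this sketch skips that step for clarity.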
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTES&lt;/strong>&lt;/p>
&lt;p>The objectives in this project are:
&amp;hellip;&lt;/p>
&lt;/aside>
&lt;hr>
&lt;h2 id="datasets">Datasets&lt;/h2>
&lt;ul>
&lt;li>Dataset 1: reaching task (Hatsopoulos et al., J. Neurophysiol., 2004)&lt;/li>
&lt;/ul>
&lt;span class="fragment " >
&lt;ul>
&lt;li>Dataset 2: time interval task (Meirhaeghe et al., Neuron, 2021)&lt;/li>
&lt;/ul>
&lt;/span>
&lt;aside class="notes">
&lt;p>&lt;strong>1 MINUTE&lt;/strong>&lt;/p>
&lt;p>During the project we will focus on two datasets:&lt;/p>
&lt;ul>
&lt;li>&amp;hellip; which is openly available&lt;/li>
&lt;li>the second &amp;hellip; which will be provided during the course&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h2 id="dataset-1-reaching-task">Dataset 1: reaching task&lt;/h2>
&lt;span class="fragment " >
&lt;p>&lt;img data-src="https://raw.githubusercontent.com/CONECT-INT/2023_CENTURI-SummerSchool/main/datasets/dataset1_reaching-task/centerout-task.png" height="200" />&lt;img data-src="https://raw.githubusercontent.com/CONECT-INT/2023_CENTURI-SummerSchool/main/datasets/dataset1_reaching-task/trajectories.png" height="300" />&lt;/p>
&lt;p>Hatsopoulos, Joshi, and O&amp;rsquo;Leary (2004) &lt;a href="https://journals.physiology.org/doi/full/10.1152/jn.01245.2003" target="_blank" rel="noopener">doi:10.1152/jn.01245.2003&lt;/a>&lt;/p>
&lt;/span>
&lt;span class="fragment " >
&lt;h5 id="goal-decode-intended-arm-movements-from-motor-cortical-activity">Goal: decode intended arm movements from motor cortical activity&lt;/h5>
&lt;/span>
&lt;aside class="notes">
&lt;p>&lt;strong>1 MINUTE&lt;/strong>&lt;/p>
&lt;p>The first dataset is a classic reaching task: recordings in primary motor (MI) and dorsal premotor (PMd) cortices of behaving monkeys instructed to move a cursor from the center to a target.&lt;/p>
&lt;/aside>
&lt;hr>
&lt;h2 id="dataset-2-time-interval-task">Dataset 2: time interval task&lt;/h2>
&lt;img data-src="https://raw.githubusercontent.com/CONECT-INT/2023_CENTURI-SummerSchool/main/datasets/dataset2_time-interval-task/dataset2_fig1A.png" height="300" />
&lt;span class="fragment " >
&lt;img data-src="https://raw.githubusercontent.com/CONECT-INT/2023_CENTURI-SummerSchool/main/datasets/dataset2_time-interval-task/dataset2_fig2.png" height="300" />
&lt;/span>
&lt;p>Meirhaeghe, Sohn, and Jazayeri (2021) &lt;a href="https://www.cell.com/neuron/fulltext/S0896-6273%2821%2900622-X" target="_blank" rel="noopener">doi:10.1016/j.neuron.2021.08.025&lt;/a>&lt;/p>
&lt;span class="fragment " >
&lt;h5 id="goal-relating-neural-dynamics-to-animals-behavioral-performance">Goal: relating neural dynamics to animals’ behavioral performance&lt;/h5>
&lt;/span>
&lt;aside class="notes">
&lt;p>&lt;strong>1 MINUTE&lt;/strong>&lt;/p>
&lt;p>The second dataset is more challenging and involves:&lt;/p>
&lt;ul>
&lt;li>Monkeys measured time intervals drawn from various distributions&lt;/li>
&lt;li>Activity in the frontal cortex scaled in time with the mean interval&lt;/li>
&lt;li>Temporal scaling allowed time to be encoded predictively relative to the mean&lt;/li>
&lt;/ul>
&lt;/aside>
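A toy sketch of the temporal-scaling idea (not the published analysis): if activity on a trial with mean interval T follows r(t) = f(t / T) for some fixed profile f, then trajectories from different interval distributions collapse once time is normalized by the mean interval:

```python
import numpy as np

def profile(u):
    """Canonical ramp over normalized time u in [0, 1] (smoothstep)."""
    return u**2 * (3 - 2 * u)

# Two hypothetical conditions with different mean intervals (in seconds).
t_short = np.linspace(0, 0.8, 200)    # mean interval 800 ms
t_long = np.linspace(0, 1.2, 200)     # mean interval 1200 ms
r_short = profile(t_short / 0.8)      # activity scales with the mean interval
r_long = profile(t_long / 1.2)

# After dividing time by the mean interval, the trajectories overlap.
collapse_error = np.abs(r_short - r_long).max()
print(f"max difference after temporal rescaling: {collapse_error:.2e}")
```

In the real data the collapse is of course only approximate, which is exactly what makes it a quantitative test of the scaling hypothesis.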
&lt;hr>
&lt;h2 id="dataset-2-time-interval-task-1">Dataset 2: time interval task&lt;/h2>
&lt;img data-src="https://github.com/SpikeAI/2022_polychronies-review/raw/main/figures/malvache2016.png" height="300" />
&lt;p>Malvache, Reichinnek, Vilette, Haimerl &amp;amp; Cossart (2016) &lt;a href="https://www.science.org/doi/10.1126/science.aaf3319" target="_blank" rel="noopener">doi:10.1126/science.aaf3319&lt;/a>&lt;/p>
&lt;span class="fragment " >
&lt;h5 id="goal-use-precise-spike-times-to-improve-decoding">Goal: use precise spike times to improve decoding&lt;/h5>
&lt;/span>
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>
&lt;p>our goal is to improve decoding&lt;/p>
&lt;/li>
&lt;li>
&lt;p>Internal representation of hippocampal neuronal population spans a time-distance continuum.&lt;/p>
&lt;/li>
&lt;li>
&lt;p>yet the domain is vast, and there&amp;rsquo;s a lot to do in SNNs&lt;/p>
&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;!--
---
## Dataset 2: time interval task
&lt;img data-src="https://github.com/SpikeAI/2022_polychronies-review/raw/main/figures/haimerl2019.jpg" height="300" />
Haimerl, Angulo-Garcia *et al*, (2019) [doi:10.1073/pnas.1718518116](https://doi.org/10.1073/pnas.1718518116)
##### Goal: use precise spike times to improve decoding
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>
&lt;p>our goal is to improve decoding&lt;/p>
&lt;/li>
&lt;li>
&lt;p>Internal representation of hippocampal neuronal population spans a time-distance continuum.&lt;/p>
&lt;/li>
&lt;li>
&lt;p>yet the domain is vast, and there s lot to do in SNNs&lt;/p>
&lt;/li>
&lt;/ul>
&lt;/aside> -->
&lt;hr>
&lt;h1 id="questions">Questions?&lt;/h1>
&lt;ul>
&lt;li>home page: &lt;a href="https://conect-int.github.io/talk/2022-06-20-conect-at-the-centuri-summer-school/" target="_blank" rel="noopener">https://conect-int.github.io/talk/2022-06-20-conect-at-the-centuri-summer-school/&lt;/a>&lt;/li>
&lt;li>Contact us @ &lt;a href="mailto:nmrghe@gmail.com,laurent.perrinet@univ-amu.fr">nicolas.meirhaeghe@univ-amu.fr, laurent.perrinet@univ-amu.fr&lt;/a>&lt;/li>
&lt;li>GitHub repository: &lt;a href="https://github.com/CONECT-INT/2023_CENTURI-SummerSchool" target="_blank" rel="noopener">https://github.com/CONECT-INT/2023_CENTURI-SummerSchool&lt;/a>&lt;/li>
&lt;/ul>
&lt;aside class="notes">
&lt;p>&lt;strong>1 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>we look forward to starting work with you on this project!&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h1 id="questions-1">Questions?&lt;/h1>
&lt;img data-src="data:image/gif;base64,R0lGODlhpACkAJEAAAAAAP///wAAAAAAACH5BAEAAAIALAAAAACkAKQAAAL/jI+py+0Po5y02ouz3rz7D4biSJbmiabqyrbuC8fyTNd2DOT6zvc+4wsCIMKiMYfYUY5M3bLJBEKJ0GrSOaken1qhtEntbg/KrDjIPfO+0YfaeEWa32VJvXJnx9X6DZqMZZHnFohX2GdAN7QwKPgDKOcYGdFIechYqIiY8Zd4affZUBm26HCniTmJ0RkwSqj6WtoKByl7SqtxC2uKOzs55rkrmtmra9u7SizMxmrcHLoJ7Hv8i3xhnPZcXbSXtqdNHc6avC1LCv79GGwem97TXQsifT6VKq4+7V2LHv8xH2vFnrs1/UCV47dOnjVeXaIVgyZwHz5s/hYOaxhxHUJL/wcnKiOY8NpHeCFLJqCo0eOyjNM2btRnEmU+ls5UsmN4713BliNFliM50yTQmjqFtpNYNCjPnz7DDe2pQKY0V1Eh4nyKMSZUQ9x2wuR4EyxWLWOBZrNJLuymr0TbaGUqyYvXOWpZsv0Y8K3Tpi/pnn1TdqfUrYPhLuVDF5XewIsFE36saGXVyHsbW1ZauDJmynUnR2a82XBm0G3P0DBtr+1AeklvdEDtWTXSzoepuuZKNjXe3ZJrW7391+1Jm7JxjntR2uJAmaDHHo9dLxdv4czQ0gyVvDdsTtP/QU9aibnjrsajc+8IBqB1z2bHyy2f3sPU7tOX00f/fnjv6vvVg/+/n1OA2QmIT3sGgXRefgOuNtuCDRY414G2sYaggynhZ6FvrUUI1oTXZaUUKRQ296AwHl5Im3PmnRiibqJhZxGLz11EIHktXuWfZuHF+BuDB9pno4y/uQSjjQbSyOJXJKKYmXhGXSZkf3E5ZOJD+B2544tS3vhakZoBqeN6+oU1ml8jZPlliWH+V2Wba6bIoUJGNsUfbeCgOeCMcuaXVp0jzuanmnpWNOeULtpJnJd5QiiCd7gFCl+AkGrIZ46kienjmGBi2WOchFYIGahuorkph1HCKR2bbzp5GJUKeqnkmWIWx+WMGZJqJnC2KhrqnyheWiiXe+L63Zu+tupeqez/nbDrqL1aCmiyaq4l64ZPInvZpERiCGu1ZDKqqbSRNkmHiN5O6uqRyob22VGN8qrds+OGqhiNJOA5pLxIzsqZuamuCiC+qi4a5Ipb9ekewd8WPLAYYJb5r4rcWktppvXmxS4H5E6M4Lp3OiyovsHZRnK+zlJ8arqeSgiuuivXyuOWxGbsh3Ivl3xyxy9zqiWq+1Ls8rU8Vyq0XeIKy2y386qaqcQAc8zCzHVuq7NeG9fos8ZK//zqlR+KKum0K0h9KNUzmw1gcB6bt67Tjjoaa6Rrxzdms1jbPeiPiaUdcs5ng1xskkbLLTaT4L4d8OE2680yw0RTqvDfDTued4dW/z6O8XZOP8z3v/MFmzng4f7aN9Za04046qJvXnrbfMF7t8gJb+2vWDsfqq3BhrG+LOO2F014sZz37CrZOPouOOnCt34076P/Dn3yzw8P9q2wX60wtb5TT+vnC3dtOvO9P1r25db7TW+ioHc6Pvf1FQ6z43NTd3v7X0ueZuOIog/0p7XbvSTLxYt/VQsBzsI2PlYdS3uQY1+C/qc44r1ugN+DW80o2LT60Wx7paNa0KZ3P9q5C2n2M5zqJMjBBEbQZwocXAPl17wRFk+ExyMhCE0oHIgJsHoHw50PlWcvU2FKg1N7H/CwVTcjBvFafTkihbpnsgJ+DINCZJoLR6bEA/8KDIFLjB/YGBg3rtWOgZXboveKKDMHljGK6HLhdsyoxjj274hbHGMHe4jDMLaxhDdEIhlFB8c07q6K3xuaxVr2wz5msIbG86IhowXEBd4MkSp00x8HaTUlSpKOlHxeI/E3Q6/pD4yQhN4hKSfHx6ntjjBMIRK9d6rKHYsz+eviKcEXy07Osl8arCP8ZAm/tETu
cqdJH8o0ebQ1WtEG2NsaAGdHwRa6oJmWLCWwCnhL11CzkNZMZiebuEPN7Y1um3ylvqSZRMBMkH7BS6exvCnIdsUFY+X0IDxZyMt5sg1ay4OS7Gp4F0y+8GlflGEigenErw3tjL9kIz/9aFDk4XH/oG6b6AeniM8tXRONBc3jJKOJTEYSk5DCNGbsQIRQT46UiQ7MZilpSauAgu+R+uzZME+IzREylKUafWkWYSpK8nnUnxwLJDf5iLY5qrSVG6Ro+aqJVH4FK5QdbSo9g0nSg3Yueiu16j6H2kdfZlKpFbXpOXVHUOlRT6ZJfScOdWhB/aX0fFzcJlwXF05dQrOq9nzrWac6TlVeMaLu3Cm7lJkbTsYTYUub6V61iFY7AjRXXDNshjgKzg9qdrB+XeZhvwlIh8o1tFDFzV3J+Vi8uo+pm+2nV1E7VnZWlrRHNe1fZXvTjMrTkeu8ZupOmlPgCHe4xC2ucY+L3OQqd7nMD22uc58L3ehKd7rUZUEBAAA7" height="300" /></description></item><item><title>CONECT thematic day on Spiking Neural Networks</title><link>https://conect-int.github.io/slides/2023-03-28-conect-seminar-day-on-snns/</link><pubDate>Tue, 28 Mar 2023 10:00:00 +0000</pubDate><guid>https://conect-int.github.io/slides/2023-03-28-conect-seminar-day-on-snns/</guid><description>
&lt;section data-noprocess data-shortcode-slide
data-background-image="/media/open-book.jpg"
>
&lt;h1 id="spiking-neural-networks">Spiking Neural Networks&lt;/h1>
&lt;h2 id="conect-thematic-day">CONECT thematic day&lt;/h2>
&lt;aside class="notes">
&lt;p>&lt;strong>1 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>Press &lt;code>S&lt;/code> key to view&lt;/li>
&lt;li>Hi, I am LP and, on behalf of CONECT, we look forward to discussing SNNs&lt;/li>
&lt;li>as part of the CONECT&amp;hellip;&lt;/li>
&lt;li>In this short presentation, we will present the challenges that we want to tackle and which we named&amp;hellip;&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;img data-src="https://conect-int.github.io/slides/conect/CONECT-logo.png" height="200" />
&lt;p>&lt;a href="https://conect-int.github.io" target="_blank" rel="noopener">CONECT: Computational Neuroscience Center @ INT&lt;/a>&lt;/p>
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTE&lt;/strong>&lt;/p>
&lt;p>So, what is CONECT?&lt;/p>
&lt;ul>
&lt;li>
&lt;p>CONECT is the Computational Neuroscience Center @ INT, bringing together a core of theoreticians&lt;/p>
&lt;/li>
&lt;li>
&lt;p>aims to build bridges within neuroscience&lt;/p>
&lt;/li>
&lt;li>
&lt;p>and across the community&lt;/p>
&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h2 id="challenge-visual-latencies">Challenge: Visual latencies&lt;/h2>
&lt;img data-src="https://github.com/SpikeAI/2022_polychronies-review/raw/main/figures/visual-latency-estimate.jpg" height="420" />
&lt;p>&lt;a href="https://doi.org/10.1126/science.1058249" target="_blank" rel="noopener">Thorpe &amp;amp; Fabre-Thorpe, 2001&lt;/a>&lt;/p>
&lt;aside class="notes">
&lt;p>&lt;strong>1 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>
&lt;p>In particular in our group, we are interested in dynamics of neural processing&lt;/p>
&lt;/li>
&lt;li>
&lt;p>The visual system is very efficient: starting from the retinal image and passing through the different stages of the visual pathways, it generates a decision, here for a macaque monkey a reaction of the finger muscles, in about 300 milliseconds.&lt;/p>
&lt;/li>
&lt;li>
&lt;p>the process of categorizing an object passes through about 10 processing stages&lt;/p>
&lt;/li>
&lt;/ul>
&lt;/aside>
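The back-of-the-envelope argument in these notes can be made explicit (the firing rate below is a deliberately generous assumption, chosen only for illustration):

```python
# If a visuomotor reaction takes ~300 ms through ~10 processing stages,
# each stage has only a few tens of milliseconds, so each neuron can emit
# at most a handful of spikes before the next stage must already respond.
reaction_time_ms = 300
n_stages = 10
typical_rate_hz = 100      # assumed, generously high cortical firing rate

per_stage_ms = reaction_time_ms / n_stages
spikes_per_stage = typical_rate_hz * per_stage_ms / 1000
print(f"{per_stage_ms:.0f} ms per stage, ~{spikes_per_stage:.0f} spike(s) per neuron")
```

This is the classic argument that rate codes averaged over long windows are too slow, and that precise spike timing must carry information.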
&lt;hr>
&lt;h2 id="challenge-visual-latencies-1">Challenge: Visual latencies&lt;/h2>
&lt;img data-src="https://github.com/SpikeAI/2022_polychronies-review/raw/main/figures/visual-latency.jpg" height="420" />
&lt;p>Review on &lt;a href="https://laurentperrinet.github.io/publication/grimaldi-22-polychronies/" target="_blank" rel="noopener">Precise Spiking Motifs&lt;/a>&lt;/p>
&lt;aside class="notes">
&lt;p>&lt;strong>1 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>
&lt;p>the latencies are similar in the human brain, merely scaled with brain size&lt;/p>
&lt;/li>
&lt;li>
&lt;p>as a consequence, it is thought that this efficiency is achieved by spikes, that is, brief all-or-none events which are passed across the very large network that forms the brain, from one assembly of neurons to another.&lt;/p>
&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h2 id="key-spiking-neural-networks">Key: Spiking Neural Networks&lt;/h2>
&lt;img data-src="https://github.com/SpikeAI/2022_polychronies-review/raw/main/figures/replicating_MainenSejnowski1995.png" height="420" />
&lt;p>&lt;a href="https://github.com/SpikeAI/2022_polychronies-review/blob/main/src/Figure_2_MainenSejnowski1995.ipynb" target="_blank" rel="noopener">Mainen Sejnowski, 1995&lt;/a>&lt;/p>
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>reproducibility&lt;/li>
&lt;/ul>
&lt;/aside>
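The reliability result can be illustrated with a toy leaky integrate-and-fire simulation (a sketch inspired by, not a replication of, Mainen &amp; Sejnowski): spike times are reproducible across trials when the same frozen fluctuating input is replayed, but drift apart under constant drive:

```python
import numpy as np

rng = np.random.default_rng(2)

def lif_spike_times(i_ext, noise_amp=0.2, dt=1e-4, tau=0.02, v_th=1.0):
    """Leaky integrate-and-fire neuron with intrinsic noise; spike times in s."""
    v, spikes = 0.0, []
    for k, i in enumerate(i_ext):
        v += dt / tau * (i - v) + noise_amp * np.sqrt(dt) * rng.standard_normal()
        if v >= v_th:
            spikes.append(k * dt)
            v = 0.0                      # reset after a spike
    return np.array(spikes)

def jitter(i_ext, n_trials=5):
    """Mean over spikes of the spike-time s.d. across repeated trials."""
    trials = [lif_spike_times(i_ext) for _ in range(n_trials)]
    n = min(len(t) for t in trials)
    return np.stack([t[:n] for t in trials]).std(axis=0).mean()

n_steps = 20000                                    # 2 s at dt = 0.1 ms
dc_input = np.full(n_steps, 1.1)                   # constant suprathreshold drive
# The same fluctuating ("frozen noise") input is replayed on every trial.
frozen = np.repeat(1.1 + 0.8 * rng.standard_normal(n_steps // 50), 50)

jitter_dc, jitter_frozen = jitter(dc_input), jitter(frozen)
print(f"jitter: DC {jitter_dc * 1e3:.1f} ms, frozen {jitter_frozen * 1e3:.1f} ms")
```

All parameters (time constant, noise amplitude, input statistics) are invented for the demo; the linked notebook reproduces the actual figure.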
&lt;hr>
&lt;h2 id="key-spiking-neural-networks-1">Key: Spiking Neural Networks&lt;/h2>
&lt;img data-src="https://github.com/SpikeAI/2022_polychronies-review/raw/main/figures/Diesmann_et_al_1999.png" height="420" />
&lt;p>&lt;a href="https://github.com/SpikeAI/2022_polychronies-review/blob/main/src/Figure_3_Diesmann_et_al_1999.py" target="_blank" rel="noopener">Diesmann et al. 1999&lt;/a>&lt;/p>
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>This hypothesis is reviewed with respect to our knowledge of the neurobiology, for instance in the hippocampus of rodents. We also review&amp;hellip;&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h2 id="hypothesis-spiking-motifs">Hypothesis: Spiking motifs&lt;/h2>
&lt;img data-src="https://github.com/SpikeAI/2022_polychronies-review/raw/main/figures/haimerl2019.jpg" height="420" />
&lt;p>Review on &lt;a href="https://laurentperrinet.github.io/publication/grimaldi-22-polychronies/" target="_blank" rel="noopener">Precise Spiking Motifs&lt;/a>&lt;/p>
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>This hypothesis is reviewed with respect to our knowledge of the neurobiology, for instance in the hippocampus of rodents. We also review&amp;hellip;&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h2 id="hypothesis-spiking-motifs-1">Hypothesis: Spiking motifs&lt;/h2>
&lt;img data-src="https://github.com/SpikeAI/2022_polychronies-review/raw/main/figures/Ikegaya2004zse0150424620001.jpeg" height="420" />
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>numerous and extensive work on mechanisms that may allow the neural system to learn to actually use such precise spiking motifs by tuning the delays between pairs of neurons.&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h2 id="hypothesis-spiking-motifs-2">Hypothesis: Spiking motifs&lt;/h2>
&lt;img data-src="https://github.com/SpikeAI/2022_polychronies-review/raw/main/figures/izhikevich.png" height="420" />
&lt;p>Review on &lt;a href="https://laurentperrinet.github.io/publication/grimaldi-22-polychronies/" target="_blank" rel="noopener">Precise Spiking Motifs&lt;/a>&lt;/p>
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>
&lt;p>Izhikevich polychronization&lt;/p>
&lt;/li>
&lt;li>
&lt;p>yet the domain is vast, and there&amp;rsquo;s a lot to do in SNNs&lt;/p>
&lt;/li>
&lt;/ul>
&lt;/aside>
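A toy sketch of how a polychronous motif, i.e. a set of fixed delays between neurons, might be recovered from spike trains (the neuron names and delays are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical motif: neuron B fires 5 ms after A, neuron C fires 12 ms
# after A, embedded among background spikes. The delays can be read off
# pairwise cross-correlograms (histograms of spike-time lags).
t_anchor = np.sort(rng.uniform(0, 10.0, 80))          # motif occurrences (s)
spikes = {
    "A": np.concatenate([t_anchor, rng.uniform(0, 10.0, 40)]),
    "B": np.concatenate([t_anchor + 0.005, rng.uniform(0, 10.0, 40)]),
    "C": np.concatenate([t_anchor + 0.012, rng.uniform(0, 10.0, 40)]),
}

def best_lag_ms(src, dst, max_lag_ms=20):
    """Most frequent (dst - src) spike-time lag, in whole milliseconds."""
    lags = ((dst[None, :] - src[:, None]) * 1000).ravel()
    lags = lags[(lags > -0.5) & (lags < max_lag_ms + 0.5)]
    hist, _ = np.histogram(lags, bins=max_lag_ms + 1,
                           range=(-0.5, max_lag_ms + 0.5))
    return int(np.argmax(hist))       # bins are centered on integer lags

lag_ab = best_lag_ms(spikes["A"], spikes["B"])
lag_ac = best_lag_ms(spikes["A"], spikes["C"])
lag_bc = best_lag_ms(spikes["B"], spikes["C"])
print(f"recovered delays: A->B {lag_ab} ms, A->C {lag_ac} ms, B->C {lag_bc} ms")
```

Real detection of polychronous groups is much harder (unknown membership, jittered delays, overlapping motifs), which is precisely what makes this an open problem for SNN research.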
&lt;hr>
&lt;h2 id="todays-program">Today&amp;rsquo;s program&amp;hellip;&lt;/h2>
&lt;table>
&lt;tr>
&lt;th>&lt;img data-src="https://www.cwi.nl/intranet/faces/1152.jpg" height="175" />&lt;/th>
&lt;th>&lt;img data-src="https://laurentperrinet.github.io/author/antoine-grimaldi/avatar_hu85406bb2d5f7db2dce1cab01b4e48063_27520_270x270_fill_q75_lanczos_center.jpg" height="175" />&lt;/th>
&lt;th>&lt;img data-src="https://3ia.univ-cotedazur.eu/medias/photo/benoit-miramond_1621434732805-png?ID_FICHE=1087703" height="175" />&lt;/th>
&lt;th>&lt;img data-src="https://phd-seminars-sam.inria.fr/files/2019/04/photo_Andrea_Castagnetti-235x300.jpg" height="175" />&lt;/th>
&lt;th>&lt;img data-src="https://media.licdn.com/dms/image/C4D03AQG1wCHtwVhGYg/profile-displayphoto-shrink_400_400/0/1582485965416?e=1685577600&amp;v=beta&amp;t=oUiVlWlAQLG9rnz0nu0r-TdZ2LftDopThqB51nx4vQc" height="175" />&lt;/th>
&lt;/tr>
&lt;tr>
&lt;td>Sander&lt;BR>Bohte&lt;/td>
&lt;td>Antoine&lt;BR>Grimaldi&lt;/td>
&lt;td>Benoit&lt;BR>Miramond&lt;/td>
&lt;td>Andrea&lt;BR>Castagnetti&lt;/td>
&lt;td>Yann&lt;BR>Cherdo&lt;/td>
&lt;/tr>
&lt;/table>
&lt;p>&lt;a href="https://conect-int.github.io/talk/2023-03-28-conect-thematic-day-on-spiking-neural-networks/" target="_blank" rel="noopener">Program &amp;amp; more&lt;/a>&lt;/p>
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>&lt;/li>
&lt;/ul>
&lt;/aside></description></item><item><title>Computational Neuroscience project</title><link>https://conect-int.github.io/slides/2022-06-20-conect-centuri-summer-school/</link><pubDate>Mon, 20 Jun 2022 14:00:00 +0000</pubDate><guid>https://conect-int.github.io/slides/2022-06-20-conect-centuri-summer-school/</guid><description>
&lt;section data-noprocess data-shortcode-slide
data-background-image="/media/open-book.jpg"
>
&lt;h1 id="computational-neuroscience-projet">Computational Neuroscience projet&lt;/h1>
&lt;h2 id="centuri-summer-school">CENTURI Summer school&lt;/h2>
&lt;p>&lt;a href="https://conect-int.github.io/talk/2022-06-20-conect-at-the-centuri-summer-school/" target="_blank" rel="noopener">https://conect-int.github.io/talk/2022-06-20-conect-at-the-centuri-summer-school/&lt;/a>&lt;/p>
&lt;aside class="notes">
&lt;p>&lt;strong>1 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>Press &lt;code>S&lt;/code> key to view&lt;/li>
&lt;li>Hi, we are LP and NM and we look forward to starting work with you on this project&lt;/li>
&lt;li>as part of the CENTURI summer school - and we would like to thank the organizers of the school&amp;hellip;&lt;/li>
&lt;li>In this short presentation, we will present the challenges that we want to tackle and which we named&amp;hellip;&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h2 id="who-are-we">Who are we?&lt;/h2>
&lt;table>
&lt;tr>
&lt;th>&lt;img data-src="https://conect-int.github.io/authors/nicolas-meirhaeghe/avatar.jpg" height="200" />&lt;/th>
&lt;th>&lt;img data-src="https://conect-int.github.io/authors/laurent-u-perrinet/avatar.png" height="200" />&lt;/th>
&lt;/tr>
&lt;tr>
&lt;td>Nicolas&lt;BR>Meirhaeghe&lt;/td>
&lt;td>Laurent&lt;BR>Perrinet&lt;/td>
&lt;/tr>
&lt;/table>
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTE&lt;/strong>&lt;/p>
&lt;p>blah blas blah&lt;/p>
&lt;/aside>
&lt;hr>
&lt;h2 id="challenge-brain-decoding">Challenge: brain decoding&lt;/h2>
&lt;img data-src="https://raw.githubusercontent.com/CONECT-INT/2022_CENTURI-SummerSchool/main/datasets/dataset1_reaching-task/decoding.png" height="420" />
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTE&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>our brains light up with the activity of billions of cells in a structured way,&lt;/li>
&lt;li>neural activity is mostly carried by action potentials, or &lt;em>spikes&lt;/em>,&lt;/li>
&lt;li>we wish to better understand this structure by using machine learning.&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h2 id="objectives">Objectives&lt;/h2>
&lt;ul>
&lt;li>Learn computational methods to interpret and interrogate neural data&lt;/li>
&lt;li>Learn to reduce the complexity of high-dimensional neural data&lt;/li>
&lt;li>Learn statistical approaches to perform hypothesis-testing on neural data&lt;/li>
&lt;li>Learn the principles of decoding analyses to relate neural data to behavioral data&lt;/li>
&lt;/ul>
&lt;aside class="notes">
&lt;p>&lt;strong>2 MINUTES&lt;/strong>&lt;/p>
&lt;p>blah blas blah&lt;/p>
&lt;/aside>
&lt;hr>
&lt;h2 id="datasets">Datasets&lt;/h2>
&lt;ul>
&lt;li>Dataset 1: reaching task (Hatsopoulos et al., J. Neurophysiol., 2004)&lt;/li>
&lt;li>Dataset 2: grasping task (Brochier et al., Sci. Data, 2018)&lt;/li>
&lt;li>Dataset 3: time interval task (Meirhaeghe et al., Neuron, 2021)&lt;/li>
&lt;/ul>
&lt;aside class="notes">
&lt;p>&lt;strong>1 MINUTE&lt;/strong>&lt;/p>
&lt;p>blah blas blah&lt;/p>
&lt;/aside>
&lt;hr>
&lt;h2 id="dataset-1-reaching-task">Dataset 1: reaching task&lt;/h2>
&lt;h5 id="goal-decode-intended-arm-movements-from-motor-cortical-activity">Goal: decode intended arm movements from motor cortical activity&lt;/h5>
&lt;p>&lt;img data-src="https://raw.githubusercontent.com/CONECT-INT/2022_CENTURI-SummerSchool/main/datasets/dataset1_reaching-task/centerout-task.png" height="200" />&lt;img data-src="https://raw.githubusercontent.com/CONECT-INT/2022_CENTURI-SummerSchool/main/datasets/dataset1_reaching-task/trajectories.png" height="300" />&lt;/p>
&lt;p>Hatsopoulos, Joshi, and O&amp;rsquo;Leary (2004) &lt;a href="https://journals.physiology.org/doi/full/10.1152/jn.01245.2003" target="_blank" rel="noopener">doi:10.1152/jn.01245.2003&lt;/a>&lt;/p>
&lt;aside class="notes">
&lt;p>&lt;strong>1 MINUTE&lt;/strong>&lt;/p>
&lt;p>blah blas blah&lt;/p>
&lt;/aside>
&lt;!--
---
## Dataset 1: reaching task
&lt;img class="fragment" data-src="https://raw.githubusercontent.com/CONECT-INT/2022_CENTURI-SummerSchool/main/datasets/dataset1_reaching-task/dataset1_fig1.jpeg" height="350" /> &lt;img class="fragment" data-src="https://raw.githubusercontent.com/CONECT-INT/2022_CENTURI-SummerSchool/main/datasets/dataset1_reaching-task/dataset1_fig4.jpeg" height="350" />
&lt;aside class="notes">
&lt;p>&lt;strong>1 MINUTE&lt;/strong>&lt;/p>
&lt;p>blah blas blah
blah blas blah&lt;/p>
&lt;/aside> -->
&lt;hr>
&lt;h2 id="dataset-2-grasping-task">Dataset 2: grasping task&lt;/h2>
&lt;h5 id="goal-predicting-animals-reaction-times-from-neural-preparatory-activity">Goal: predicting animals’ reaction times from neural preparatory activity&lt;/h5>
&lt;img data-src="https://raw.githubusercontent.com/CONECT-INT/2022_CENTURI-SummerSchool/main/datasets/dataset2_grasping-task/reach2grasp-task.png" height="250" />
&lt;p>Brochier, Zehl, Hao, Duret, Sprenger, Denker, Grün, &amp;amp; Riehle (2018) Scientific Data 5 : 180055. &lt;a href="https://www.nature.com/articles/sdata201855" target="_blank" rel="noopener">doi:10.1038/sdata.2018.55&lt;/a>&lt;/p>
&lt;aside class="notes">
&lt;p>&lt;strong>1 MINUTE&lt;/strong>&lt;/p>
&lt;p>blah blas blah&lt;/p>
&lt;/aside>
&lt;hr>
&lt;h2 id="dataset-3-time-interval-task">Dataset 3: time interval task&lt;/h2>
&lt;h5 id="goal-relating-neural-dynamics-to-animals-behavioral-performance">Goal: relating neural dynamics to animals’ behavioral performance&lt;/h5>
&lt;img data-src="https://raw.githubusercontent.com/CONECT-INT/2022_CENTURI-SummerSchool/main/datasets/dataset3_time-interval-task/dataset3_fig1A.png" height="300" />
&lt;p>Meirhaeghe, Sohn, and Jazayeri (2021) &lt;a href="https://www.biorxiv.org/content/10.1101/2021.03.10.434831v1" target="_blank" rel="noopener">doi:10.1016/j.neuron.2021.08.025 &lt;/a>&lt;/p>
&lt;aside class="notes">
&lt;p>&lt;strong>1 MINUTE&lt;/strong>&lt;/p>
&lt;p>blah blas blah&lt;/p>
&lt;/aside>
&lt;hr>
&lt;h1 id="questions">Questions?&lt;/h1>
&lt;ul>
&lt;li>home page: &lt;a href="https://conect-int.github.io/talk/2022-06-20-conect-at-the-centuri-summer-school/" target="_blank" rel="noopener">https://conect-int.github.io/talk/2022-06-20-conect-at-the-centuri-summer-school/&lt;/a>&lt;/li>
&lt;li>Contact us @ &lt;a href="mailto:nmrghe@gmail.com,laurent.perrinet@univ-amu.fr">nicolas.meirhaeghe@univ-amu.fr, laurent.perrinet@univ-amu.fr&lt;/a>&lt;/li>
&lt;li>GitHub repository: &lt;a href="https://github.com/CONECT-INT/2022_CENTURI-SummerSchool" target="_blank" rel="noopener">https://github.com/CONECT-INT/2022_CENTURI-SummerSchool&lt;/a>&lt;/li>
&lt;/ul></description></item><item><title>Slides</title><link>https://conect-int.github.io/slides/conect/</link><pubDate>Tue, 05 Feb 2019 00:00:00 +0000</pubDate><guid>https://conect-int.github.io/slides/conect/</guid><description>
&lt;section data-noprocess data-shortcode-slide
data-background-image="/media/open-book.jpg"
>
&lt;h1 id="conect">CoNeCt&lt;/h1>
&lt;h2 id="the-computational-neuroscience-center--int">the &lt;strong>Co&lt;/strong>mputational &lt;strong>Ne&lt;/strong>uroscience &lt;strong>C&lt;/strong>en&lt;strong>t&lt;/strong>er @ INT&lt;/h2>
&lt;p>&lt;a href="https://conect-int.github.io" target="_blank" rel="noopener">https://conect-int.github.io&lt;/a>&lt;/p>
&lt;aside class="notes">
&lt;ul>
&lt;li>CONECT with one N&lt;/li>
&lt;li>Press &lt;code>S&lt;/code> key to view&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h2 id="tremendous-technological-advances">Tremendous technological advances&lt;/h2>
&lt;ul>
&lt;li>two photon imaging&lt;/li>
&lt;li>large population recording-array technologies&lt;/li>
&lt;li>optogenetic circuit control tools&lt;/li>
&lt;li>transgenic manipulations&lt;/li>
&lt;li>large volume circuit reconstructions&lt;/li>
&lt;/ul>
&lt;aside class="notes">
&lt;p>Tremendous technological advances over the past decade&lt;/p>
&lt;ul>
&lt;li>These experiments have begun to produce a huge amount of data, on a broad spectrum of temporal and spatial scales,&lt;/li>
&lt;li>providing finer and more quantitative descriptions of the biological reality than we would have been able to dream of only a decade ago.&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h2 id="a-transdisciplinary-revolution">A transdisciplinary revolution&lt;/h2>
&lt;ul>
&lt;li>across several disciplines (physics, genetics, biology, robotics, psychiatry, ..)&lt;/li>
&lt;li>and multiple scales (from micro to macro, from short to long-term, from theory to biology)&lt;/li>
&lt;li>new frontiers&lt;span class="fragment " >
… and new challenges
&lt;/span>&lt;/li>
&lt;/ul>
&lt;aside class="notes">
&lt;ul>
&lt;li>the daunting complexity of the biological reality revealed by these technologies highlights the importance of neurophysics&lt;/li>
&lt;li>to provide a conceptual bridge between abstract principles of brain function and their biological implementations within neural circuits.&lt;/li>
&lt;li>This revolution is accompanied by a parallel revolution in the domain of Artificial Intelligence. An exponential number of algorithms for sensory processing, such as image classification, or for reinforcement learning have yielded practical tools which are replacing the classical tools we used on a daily basis with a novel generation of intelligent tools.&lt;/li>
&lt;li>&lt;strong>This is the context in which we are creating CONECT.&lt;/strong>&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h2 id="conect-computational-neuroscience-center">CoNeCt: &lt;strong>Co&lt;/strong>mputational &lt;strong>Ne&lt;/strong>uroscience &lt;strong>C&lt;/strong>en&lt;strong>t&lt;/strong>er&lt;/h2>
&lt;ul>
&lt;li>close collaboration between experimentalists and theoreticians&lt;/li>
&lt;li>share state-of-the-art (experimentalists well aware of theoretical approaches, experimental techniques for theoreticians)&lt;/li>
&lt;li>complementary in its purpose to neuroinformatics&amp;hellip; &lt;span class="fragment " >
but distinct
&lt;/span>&lt;/li>
&lt;/ul>
&lt;aside class="notes">
&lt;ul>
&lt;li>We are convinced that close collaboration between experimentalists and theoreticians in neuroscience is essential to develop mechanistic as well as quantitative understandings of how the brain performs its functions. This is in fact a primary motivating force in establishing this center.&lt;/li>
&lt;li>However, for such collaborations to be effective, experimentalists must be well aware of the approaches and challenges in modeling while theoreticians must be well acquainted with the experimental techniques, their power and the challenges they present.&lt;/li>
&lt;li>CoNeCt has also the ambition to contribute to the training of a new generation of neuroscientists who will have all these qualities.&lt;/li>
&lt;/ul>
&lt;p>This approach is therefore complementary but distinct in its purpose from neuroinformatics (creation of tools for analyzing neuroscientific data) or artificial intelligence (creation of algorithms inspired by the functioning of the brain). The field of computational neuroscience is still young but its community is now structured in an autonomous community with strong interaction with the other branches of neuroscience. It is this autonomy that we want to foster at INT.&lt;/p>
&lt;/aside>
&lt;hr>
&lt;h2 id="objectives-of-conect">Objectives of CoNeCt&lt;/h2>
&lt;ul>
&lt;li>
&lt;p>to create a space for scientific discussion and animation&lt;/p>
&lt;/li>
&lt;li>
&lt;p>train students and staff and attract young researchers&lt;/p>
&lt;/li>
&lt;li>
&lt;p>structuring the network of computational neuroscience at INT, on the Timone campus, across AMU, and in France &amp;amp; internationally&lt;/p>
&lt;/li>
&lt;/ul>
&lt;aside class="notes">
&lt;ul>
&lt;li>lots of work - bottom-up approach so far&lt;/li>
&lt;li>no action taken yet&lt;/li>
&lt;li>lots of work - top-down initiatives already exist&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h2 id="actions-of-conect">Actions of CoNeCt&lt;/h2>
&lt;ul>
&lt;li>
&lt;div class="view-list view-list-item">
&lt;i class="far fa-newspaper pub-icon" aria-hidden="true">&lt;/i>
&lt;a href="https://conect-int.github.io/post/actors-conect/" >Actors of CONECT&lt;/a>
&lt;/div>
&lt;/li>
&lt;li>
&lt;div class="view-list view-list-item">
&lt;i class="far fa-newspaper pub-icon" aria-hidden="true">&lt;/i>
&lt;a href="https://conect-int.github.io/post/objectives-conect/" >Objectives of CONECT&lt;/a>
&lt;/div>
&lt;/li>
&lt;li>
&lt;p>&lt;a href="https://conect-int.github.io/event">Past events&lt;/a> and future&lt;/p>
&lt;/li>
&lt;/ul>
&lt;aside class="notes">
&lt;ul>
&lt;li>Actors&lt;/li>
&lt;li>we have already organized events both within and outside INT&lt;/li>
&lt;li>objectives: already defined&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h1 id="questions">Questions?&lt;/h1>
&lt;p>&lt;a href="https://conect-int.github.io" target="_blank" rel="noopener">https://conect-int.github.io&lt;/a>&lt;/p>
&lt;p>&lt;a href="mailto://int-conect@univ-amu.fr">Contact us @ int-conect@univ-amu.fr!&lt;/a>&lt;/p>
&lt;p>&lt;a href="https://framateam.org/int-marseille/channels/conect" target="_blank" rel="noopener">Let&amp;rsquo;s discuss on Mattermost&lt;/a>&lt;/p></description></item><item><title>Slides</title><link>https://conect-int.github.io/slides/example/</link><pubDate>Tue, 05 Feb 2019 00:00:00 +0000</pubDate><guid>https://conect-int.github.io/slides/example/</guid><description>&lt;h1 id="create-slides-in-markdown-with-hugo-blox-builder">Create slides in Markdown with Hugo Blox Builder&lt;/h1>
&lt;p>&lt;a href="https://hugoblox.com/" target="_blank" rel="noopener">Hugo Blox Builder&lt;/a> | &lt;a href="https://docs.hugoblox.com/content/slides/" target="_blank" rel="noopener">Documentation&lt;/a>&lt;/p>
&lt;hr>
&lt;h2 id="features">Features&lt;/h2>
&lt;ul>
&lt;li>Efficiently write slides in Markdown&lt;/li>
&lt;li>3-in-1: Create, Present, and Publish your slides&lt;/li>
&lt;li>Supports speaker notes&lt;/li>
&lt;li>Mobile-friendly slides&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="controls">Controls&lt;/h2>
&lt;ul>
&lt;li>Next: &lt;code>Right Arrow&lt;/code> or &lt;code>Space&lt;/code>&lt;/li>
&lt;li>Previous: &lt;code>Left Arrow&lt;/code>&lt;/li>
&lt;li>Start: &lt;code>Home&lt;/code>&lt;/li>
&lt;li>Finish: &lt;code>End&lt;/code>&lt;/li>
&lt;li>Overview: &lt;code>Esc&lt;/code>&lt;/li>
&lt;li>Speaker notes: &lt;code>S&lt;/code>&lt;/li>
&lt;li>Fullscreen: &lt;code>F&lt;/code>&lt;/li>
&lt;li>Zoom: &lt;code>Alt + Click&lt;/code>&lt;/li>
&lt;li>&lt;a href="https://revealjs.com/pdf-export/" target="_blank" rel="noopener">PDF Export&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="code-highlighting">Code Highlighting&lt;/h2>
&lt;p>Inline code: &lt;code>variable&lt;/code>&lt;/p>
&lt;p>Code block:&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="line">&lt;span class="cl">&lt;span class="n">porridge&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;blueberry&amp;#34;&lt;/span>
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="k">if&lt;/span> &lt;span class="n">porridge&lt;/span> &lt;span class="o">==&lt;/span> &lt;span class="s2">&amp;#34;blueberry&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span>
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl"> &lt;span class="nb">print&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;Eating...&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;hr>
&lt;h2 id="math">Math&lt;/h2>
&lt;p>In-line math: $x + y = z$&lt;/p>
&lt;p>Block math:&lt;/p>
&lt;p>$$
f\left( x \right) = \frac{{2\left( {x + 4} \right)\left( {x - 4} \right)}}{{\left( {x + 4} \right)\left( {x + 1} \right)}}
$$&lt;/p>
&lt;hr>
&lt;h2 id="fragments">Fragments&lt;/h2>
&lt;p>Make content appear incrementally&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" class="chroma">&lt;code class="language-fallback" data-lang="fallback">&lt;span class="line">&lt;span class="cl">{{% fragment %}} One {{% /fragment %}}
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">{{% fragment %}} **Two** {{% /fragment %}}
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">{{% fragment %}} Three {{% /fragment %}}
&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>Press &lt;code>Space&lt;/code> to play!&lt;/p>
&lt;span class="fragment " >
One
&lt;/span>
&lt;span class="fragment " >
&lt;strong>Two&lt;/strong>
&lt;/span>
&lt;span class="fragment " >
Three
&lt;/span>
&lt;hr>
&lt;p>A fragment can accept two optional parameters:&lt;/p>
&lt;ul>
&lt;li>&lt;code>class&lt;/code>: use a custom style (requires definition in custom CSS)&lt;/li>
&lt;li>&lt;code>weight&lt;/code>: sets the order in which a fragment appears&lt;/li>
&lt;/ul>
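&lt;p>For example, the two parameters can be combined to style and reorder fragments (a sketch; the &lt;code>highlight-red&lt;/code> class name is hypothetical and would need a matching rule in your custom CSS):&lt;/p>
&lt;pre>&lt;code>{{% fragment weight="2" class="highlight-red" %}} Appears second, styled {{% /fragment %}}
{{% fragment weight="1" %}} Appears first {{% /fragment %}}
&lt;/code>&lt;/pre>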
&lt;hr>
&lt;h2 id="speaker-notes">Speaker Notes&lt;/h2>
&lt;p>Add speaker notes to your presentation&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" class="chroma">&lt;code class="language-markdown" data-lang="markdown">&lt;span class="line">&lt;span class="cl">{{% speaker_note %}}
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="k">-&lt;/span> Only the speaker can read these notes
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="k">-&lt;/span> Press &lt;span class="sb">`S`&lt;/span> key to view
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl"> {{% /speaker_note %}}
&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>Press the &lt;code>S&lt;/code> key to view the speaker notes!&lt;/p>
&lt;aside class="notes">
&lt;ul>
&lt;li>Only the speaker can read these notes&lt;/li>
&lt;li>Press &lt;code>S&lt;/code> key to view&lt;/li>
&lt;/ul>
&lt;/aside>
&lt;hr>
&lt;h2 id="themes">Themes&lt;/h2>
&lt;ul>
&lt;li>black: Black background, white text, blue links (default)&lt;/li>
&lt;li>white: White background, black text, blue links&lt;/li>
&lt;li>league: Gray background, white text, blue links&lt;/li>
&lt;li>beige: Beige background, dark text, brown links&lt;/li>
&lt;li>sky: Blue background, thin dark text, blue links&lt;/li>
&lt;/ul>
&lt;hr>
&lt;ul>
&lt;li>night: Black background, thick white text, orange links&lt;/li>
&lt;li>serif: Cappuccino background, gray text, brown links&lt;/li>
&lt;li>simple: White background, black text, blue links&lt;/li>
&lt;li>solarized: Cream-colored background, dark green text, blue links&lt;/li>
&lt;/ul>
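&lt;p>A theme is typically selected in the slide deck&amp;rsquo;s front matter (a sketch of the expected schema; check the Hugo Blox slides documentation for the options available in your version):&lt;/p>
&lt;pre>&lt;code>slides:
  theme: night
  highlight_style: dracula
&lt;/code>&lt;/pre>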
&lt;hr>
&lt;section data-noprocess data-shortcode-slide
data-background-image="/media/boards.jpg"
>
&lt;h2 id="custom-slide">Custom Slide&lt;/h2>
&lt;p>Customize the slide style and background&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" class="chroma">&lt;code class="language-markdown" data-lang="markdown">&lt;span class="line">&lt;span class="cl">{{&lt;span class="p">&amp;lt;&lt;/span> &lt;span class="nt">slide&lt;/span> &lt;span class="na">background-image&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;/media/boards.jpg&amp;#34;&lt;/span> &lt;span class="p">&amp;gt;&lt;/span>}}
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">{{&lt;span class="p">&amp;lt;&lt;/span> &lt;span class="nt">slide&lt;/span> &lt;span class="na">background-color&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;#0000FF&amp;#34;&lt;/span> &lt;span class="p">&amp;gt;&lt;/span>}}
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">{{&lt;span class="p">&amp;lt;&lt;/span> &lt;span class="nt">slide&lt;/span> &lt;span class="na">class&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;my-style&amp;#34;&lt;/span> &lt;span class="p">&amp;gt;&lt;/span>}}
&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;hr>
&lt;h2 id="custom-css-example">Custom CSS Example&lt;/h2>
&lt;p>Let&amp;rsquo;s make headers navy colored.&lt;/p>
&lt;p>Create &lt;code>assets/css/reveal_custom.css&lt;/code> with:&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" class="chroma">&lt;code class="language-css" data-lang="css">&lt;span class="line">&lt;span class="cl">&lt;span class="p">.&lt;/span>&lt;span class="nc">reveal&lt;/span> &lt;span class="nt">section&lt;/span> &lt;span class="nt">h1&lt;/span>&lt;span class="o">,&lt;/span>
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="p">.&lt;/span>&lt;span class="nc">reveal&lt;/span> &lt;span class="nt">section&lt;/span> &lt;span class="nt">h2&lt;/span>&lt;span class="o">,&lt;/span>
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="p">.&lt;/span>&lt;span class="nc">reveal&lt;/span> &lt;span class="nt">section&lt;/span> &lt;span class="nt">h3&lt;/span> &lt;span class="p">{&lt;/span>
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl"> &lt;span class="k">color&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="kc">navy&lt;/span>&lt;span class="p">;&lt;/span>
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="p">}&lt;/span>
&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;hr>
&lt;h1 id="questions">Questions?&lt;/h1>
&lt;p>&lt;a href="https://discord.gg/z8wNYzb" target="_blank" rel="noopener">Ask&lt;/a>&lt;/p>
&lt;p>&lt;a href="https://docs.hugoblox.com/content/slides/" target="_blank" rel="noopener">Documentation&lt;/a>&lt;/p></description></item></channel></rss>