Simulating Storytime – Neural Dynamics of Parent–Child Brain Synchrony
Author
Paulina Domek
SWPS University
Project Description
Abstract
“Simulating Storytime” aims to develop a computational model of parent–child brain synchrony during shared reading. Empirical studies using EEG hyperscanning suggest that reading printed books fosters stronger inter-brain alignment than screen-based reading, likely due to sustained joint attention and interaction. However, collecting such dual-brain data in real-life settings is methodologically complex and resource-intensive. This project proposes an *in silico* approach: simulating two coupled neural systems (“parent” and “child”) exposed to a shared narrative input. By systematically manipulating interaction strength and distraction, the model will visualize and quantify how different reading environments influence brain-to-brain connectivity. All code and outputs will be open-source, making developmental neuroscience accessible through interactive simulation and data visualization.
Hypothesis
We hypothesize that simulated brain signals driven by a shared rhythmic stimulus (representing a common story) and mutual feedback (representing interaction) will exhibit significantly higher synchrony than signals driven by independent or noisy inputs (representing distractions). Synchrony will be operationalized using Pearson correlation and phase-locking values. Specifically, we expect a clear “synchrony index” to emerge in the joint reading condition, which will decrease when high-frequency noise or uncorrelated inputs are introduced to simulate screen-based or distracted reading.
Method and Workflow
The simulation follows a three-stage pipeline.
First, we generate two independent oscillatory signals representing baseline parent and child brain activity, using EEG-like frequency components (e.g. theta and alpha bands).
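A minimal sketch of how this first stage could look in NumPy is shown below; the sampling rate, band frequencies, and noise level are illustrative assumptions rather than project decisions:

```python
import numpy as np

def make_baseline_signal(duration_s=60.0, fs=256, seed=None):
    """One EEG-like baseline signal: a theta and an alpha oscillation plus noise."""
    rng = np.random.default_rng(seed)
    t = np.arange(0, duration_s, 1.0 / fs)
    theta = np.sin(2 * np.pi * rng.uniform(4.0, 8.0) * t + rng.uniform(0, 2 * np.pi))
    alpha = np.sin(2 * np.pi * rng.uniform(8.0, 12.0) * t + rng.uniform(0, 2 * np.pi))
    noise = rng.standard_normal(t.size)
    return t, 1.0 * theta + 0.7 * alpha + 0.5 * noise

# Two independent "brains": different seeds give uncorrelated baseline activity
t, parent_baseline = make_baseline_signal(seed=1)
_, child_baseline = make_baseline_signal(seed=2)
```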
Second, we introduce a shared external driver: the amplitude envelope of an audiobook recording, which acts as a common rhythmic stimulus modulating both signals. In the interactive condition, additional bidirectional coupling is added to simulate conversational feedback.
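A rough sketch of the second stage is given below; it uses a synthetic stand-in for the audiobook envelope, and the driver gain, coupling strength, and lag are illustrative values only:

```python
import numpy as np

fs = 256
t = np.arange(0, 60.0, 1.0 / fs)

# Stand-in baseline signals; in the project these come from the stage-one generator
rng = np.random.default_rng(0)
parent = rng.standard_normal(t.size)
child = rng.standard_normal(t.size)

# Synthetic stand-in for the audiobook amplitude envelope (slow ~0.5 Hz modulation)
envelope = 0.5 * (1.0 + np.sin(2 * np.pi * 0.5 * t))

drive_gain = 0.8     # how strongly the shared story drives each "brain"
coupling = 0.3       # bidirectional feedback strength (interactive condition only)
lag = int(0.1 * fs)  # ~100 ms conversational delay

# Common rhythmic driver applied to both partners
parent_driven = parent + drive_gain * envelope
child_driven = child + drive_gain * envelope

# Bidirectional coupling: each partner is nudged by the other's recent activity
parent_driven[lag:] += coupling * child_driven[:-lag]
child_driven[lag:] += coupling * parent_driven[:-lag]
```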
Third, we compute synchrony metrics across different experimental conditions, including joint reading, passive co-presence, and distracted reading.
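The third stage could be sketched as below, using SciPy for band-pass filtering and the Hilbert transform; the band limits and the toy comparison are assumptions for illustration, not the final analysis choices:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from scipy.stats import pearsonr

def bandpass(x, low, high, fs, order=4):
    """Zero-phase band-pass filter (e.g. theta 4-8 Hz, alpha 8-12 Hz)."""
    b, a = butter(order, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, x)

def synchrony_metrics(x, y, fs=256, band=(4.0, 8.0)):
    """Pearson correlation and phase-locking value (PLV) in one frequency band."""
    xf, yf = bandpass(x, *band, fs), bandpass(y, *band, fs)
    r, _ = pearsonr(xf, yf)                          # amplitude similarity
    phase_diff = np.angle(hilbert(xf)) - np.angle(hilbert(yf))
    plv = np.abs(np.mean(np.exp(1j * phase_diff)))   # phase consistency, 0..1
    return r, plv

# Toy comparison: a coupled pair versus an independent pair
fs = 256
t = np.arange(0, 60.0, 1.0 / fs)
rng = np.random.default_rng(0)
story = np.sin(2 * np.pi * 6.0 * t)                  # shared 6 Hz "story" rhythm
coupled = (story + 0.5 * rng.standard_normal(t.size),
           story + 0.5 * rng.standard_normal(t.size))
independent = (rng.standard_normal(t.size), rng.standard_normal(t.size))
print(synchrony_metrics(*coupled, fs))               # high correlation and PLV
print(synchrony_metrics(*independent, fs))           # much lower synchrony
```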
Tools and Implementation
All simulations will be implemented in Python using standard scientific libraries. NumPy and SciPy will be used for signal generation and filtering, while Matplotlib will be used for visualizations such as time-series plots and correlation heatmaps. The shared story input will be derived from a real audiobook. For validation, the analysis pipeline can be compared conceptually to open dual-EEG datasets, such as the Leong & Wass (2018) mother–infant hyperscanning dataset.
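As an illustration of how the audiobook input might be turned into a driver signal (the file name is hypothetical, and the Hilbert-envelope-plus-resampling route is just one reasonable option):

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import hilbert, resample_poly

# Hypothetical audiobook excerpt; any mono WAV file would work here
fs_audio, audio = wavfile.read("storytime_excerpt.wav")
audio = audio.astype(float)
if audio.ndim > 1:                 # mix stereo down to mono
    audio = audio.mean(axis=1)

# Amplitude envelope via the Hilbert transform, resampled to an EEG-like rate
target_fs = 256
envelope = np.abs(hilbert(audio))
envelope = resample_poly(envelope, up=target_fs, down=fs_audio)
envelope = (envelope - envelope.mean()) / envelope.std()  # z-score before use as a driver
```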
Significance
This project provides a conceptual sandbox for exploring why shared reading is neurobiologically different from solitary or screen-based media use. While the model is necessarily simplified, it allows users to test mechanistic hypotheses about joint attention, interaction, and distraction in a transparent and reproducible way. The simulation translates abstract findings from developmental neuroscience into an intuitive, interactive tool aligned with the open-science and interdisciplinary ethos of Brainhack. Ultimately, it demonstrates how computational modeling can complement empirical research by offering insight into complex social brain dynamics that are otherwise difficult to observe directly.
Project requirements
Education & Background
This project is aimed at students, PhD candidates, and early-career researchers.
We welcome people from different fields, especially neuroscience, psychology, computer science, biology, and physics, but also linguistics.
You don’t need to be an expert in everything. The ideal team mixes people who like coding and data with people who understand how the brain and child development work.
Language
A working level of English is enough. You should be comfortable reading scientific papers and communicating with an international team.
Skills & Attitude
Basic to intermediate Python skills will be very useful, especially with libraries like `NumPy`, `SciPy`, or `Matplotlib`.
BUT! If you are just starting out, you are still very welcome – there are plenty of AI tools that can help you learn how to solve problems and code during the Brainhack 🙂
Equally important is an interest in open science – all code and results will be shared openly, and we will be working with open data 🙂
This is a team project, so a collaborative mindset matters more than perfect technical skills. We’re building something together across disciplines.
Equipment
Just bring your own laptop with Python installed. We will work in Jupyter Notebooks.
No EEG or lab equipment is needed – everything happens in simulation.
In short: if you’re curious about the brain, enjoy working with data (or want to learn), and like interdisciplinary projects, you’re a good fit.
Programming languages used in this project
Python. Libraries and technical stack: NumPy, SciPy, Matplotlib / Seaborn, MNE-Python (potential), Jupyter Notebooks.
Who are we looking for?
We are looking for participants with different academic and professional backgrounds, especially:
- Students (Bachelor’s, Master’s, PhD)
- Early-career researchers
- Data scientists / programmers
- Neuroscientists
- Psychologists
- Cognitive scientists
- Biologists
- Physicists
- AI / machine learning enthusiasts
- UX / data visualization designers (optional but welcome)
You do not need to be a specialist in neuroscience. A mix of technical and non-technical profiles is ideal.
What can you gain from participating?
Skills
Participants will learn how to:
- Generate synthetic brain data – create EEG-like signals in Python using NumPy and SciPy.
- Measure synchrony – calculate how similar two signals are using correlation and phase-based methods.
- Visualize results – build plots and heatmaps that show brain-to-brain alignment over time (see the sketch after this list).
- Process signals – apply basic filtering to focus on relevant frequency bands (e.g. alpha, theta).
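A small example of the kind of visualization participants could build, using toy data and a sliding-window Pearson correlation; the window length and signal parameters are arbitrary choices for illustration:

```python
import numpy as np
import matplotlib.pyplot as plt

def windowed_correlation(x, y, fs=256, win_s=2.0):
    """Pearson correlation in consecutive windows: a simple synchrony-over-time index."""
    win = int(win_s * fs)
    n_win = len(x) // win
    return np.array([
        np.corrcoef(x[i * win:(i + 1) * win], y[i * win:(i + 1) * win])[0, 1]
        for i in range(n_win)
    ])

# Toy signals: a shared 6 Hz rhythm plus independent noise for each partner
fs = 256
t = np.arange(0, 60.0, 1.0 / fs)
rng = np.random.default_rng(0)
story = np.sin(2 * np.pi * 6.0 * t)
parent = story + 0.5 * rng.standard_normal(t.size)
child = story + 0.5 * rng.standard_normal(t.size)

corr = windowed_correlation(parent, child, fs)
plt.figure(figsize=(8, 2))
plt.imshow(corr[np.newaxis, :], aspect="auto", cmap="viridis", vmin=-1, vmax=1)
plt.colorbar(label="windowed Pearson r")
plt.yticks([])
plt.xlabel("2 s window index")
plt.title("Parent-child synchrony over time (toy data)")
plt.tight_layout()
plt.show()
```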
Neuroscience & Cognitive Understanding
Participants will gain insight into:
- Brain-to-brain synchrony – what hyperscanning is and why it matters in social neuroscience.
- Reading and development – how shared reading supports language and cognitive growth.
- Media effects – why screens and distractions may reduce joint attention and engagement.
Open Science & Collaboration
Participants will also practice:
- Reproducible research – working in Jupyter Notebooks and sharing code via GitHub.
- Interdisciplinary teamwork – collaborating across psychology, neuroscience, and programming.
In short: participants leave with real coding experience, a better understanding of social brain dynamics, and a complete open-source project they can show in their portfolio or CV.
What’s even more important – the Team Leader will develop the project WITH the participants, and its direction really depends on them 🙂
Key resources
- Carrle FP, Hollenbenders Y, Reichenbach A. Generation of synthetic EEG data for training algorithms supporting the diagnosis of major depressive disorder. Front Neurosci. 2023 Oct 2;17:1219133. doi: 10.3389/fnins.2023.1219133. PMID: 37849893; PMCID: PMC10577178. https://pmc.ncbi.nlm.nih.gov/articles/PMC10577178/
- Jomaa F, Ebraheem F, Horowitz-Kraus T. Greater Parent-Child Brain Synchronisation During Printed Book Versus Screen Reading Using Hyperscanning Electroencephalograph Data. Acta Paediatr. 2025 Jul;114(7):1633-1641. doi: 10.1111/apa.70007. Epub 2025 Jan 31. PMID: 39891366; PMCID: PMC12147426. https://pmc.ncbi.nlm.nih.gov/articles/PMC12147426/
- Xu Y, Aubele J, Vigil V, Bustamante AS, Kim YS, Warschauer M. Dialogue with a conversational agent promotes children’s story comprehension via enhancing engagement. Child Dev. 2022 Mar;93(2):e149-e167. doi: 10.1111/cdev.13708. Epub 2021 Nov 8. PMID: 34748214; PMCID: PMC9299009. https://pmc.ncbi.nlm.nih.gov/articles/PMC9299009/
- Possible dataset: https://reshare.ukdataservice.ac.uk/853123/
- Context in media: https://neurosciencenews.com/mri-early-reading-brain-activity-1996/
