December 19, 2019

Your Brain on the Dance Floor

Neural Oscillations Support Musical Rhythm Perception

While it seems rather banal, humans’ ability to detect the beat in music is, in fact, a rare feat of computation. Although recent evidence suggests that some animal species may be capable of synchronizing their motor activity to a repeating auditory stimulus (see Alex the parrot and Ronan the sea lion [1]), humans have the unique ability to infer periodicity in auditory stimuli which may not explicitly contain that periodicity.
What does that mean, exactly? Give a listen to James Brown’s 1968 hit “I Got the Feelin’”[2]. The song is rhythmically complex, with accents falling in unexpected places, and yet, moving to the beat is gratifyingly easy. However, in order for you to continue feeling the pulse while Brown and his band weave in and around the beat, significant cognitive and perceptual computations are required. Neural oscillations provide the means for those computations, and thus, subsequent hip shakin’. But, before delving deeper, let’s define some terminology at the intersection of music and brain science.
Musical Time and Rhythm
Time in music is hierarchical, with divisions broader and finer than the fundamental pulse at which you might tap your foot. In Western music, the rate of that fundamental pulse is called tempo. Tempo describes the rate of evenly spaced musical beats and is quantified in beats per minute (BPM). However, as scientists, we may prefer to describe tempo in Hertz (cycles per second) as a measure of frequency. If you venture into a Berlin club, the techno music you hear will likely have a tempo near 120 BPM. Described as a beat frequency, this tempo would be 2 Hz (120 BPM / 60 sec = 2 beats per second) [3].
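The BPM-to-Hz conversion above can be sketched in a couple of lines of Python (a trivial helper for illustration, not code from any of the cited studies):

```python
# Tempo conversion: there are 60 seconds in a minute, so
# beats per minute / 60 = beats per second (Hz), and vice versa.
def bpm_to_hz(bpm):
    return bpm / 60.0

def hz_to_bpm(hz):
    return hz * 60.0

print(bpm_to_hz(120))   # 2.0 -> the 2 Hz techno beat from the text
print(hz_to_bpm(0.5))   # 30.0 -> a 0.5 Hz oscillation corresponds to 30 BPM
```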
The hierarchical level above the beat is called meter, which describes the grouping of beats into a unit called a bar or measure. In popular Western music, a bar typically has 4 beats, creating a quadruple meter. A waltz is an example of a musical style in which 3 beats define a bar, creating a triple meter [4].
Tempo and meter provide a temporal framework against which we can perceive rhythm, which we’ll define as an expressive pattern of musical note onsets, durations and accents. The music of James Brown is full of excellent examples of rhythmic syncopation, where the musical content does not always fall on the beat [5]. While the familiar ‘4 on the floor’ of techno music clearly indicates each beat, in the case of “I Got the Feelin’”, some notes in the rhythm fall in-between beats, while other beats are left silent. Syncopation builds a sensation of rhythmic tension and, when wielded properly by Brown, et al., it creates a powerful urge to dance.
Entrainment to the Beat
If we were to look at “I Got the Feelin’” as a time-series of sonic amplitude events, we would see that musical sound does not occur on every beat, yet we feel the beat continuing unbroken as the band syncopates. We perceive a periodic pulse despite the fact that the music does not explicitly sound that pulse [9]. In order for a listener to feel the beat as a regular framework continuing steadily despite rhythmic variations in the song, recent music neuroscience research indicates that neural oscillations in the auditory pathway entrain to the beat. Entrainment describes the ability of ensembles of sensory neurons to adjust the phase or period of their oscillations in order to synchronize to a regularly repeating external stimulus [6]. In the case of our James Brown song, neurons in your auditory pathway adapt their cycles so that their oscillations align with the foot-tapping pulse.
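To make "adjusting the phase or period" concrete, here is a toy sketch of an internal oscillator correcting itself toward a strictly periodic stimulus. The correction gains and the oscillator's starting period are invented for illustration; this is not a model taken from the cited entrainment literature:

```python
# Toy entrainment model: an internal "tick" generator starts with the
# wrong period (0.6 s) and, at each stimulus beat (every 0.5 s),
# corrects a fraction of both its phase error and its period error.
def entrain(stim_period=0.5, osc_period=0.6,
            phase_gain=0.5, period_gain=0.1, n_beats=40):
    """Return the asynchrony (tick time - stimulus time) at each beat."""
    tick = 0.0
    errs = []
    for i in range(n_beats):
        asyn = tick - i * stim_period        # how far off is our internal tick?
        errs.append(asyn)
        osc_period -= period_gain * asyn     # period correction
        tick += osc_period - phase_gain * asyn  # phase-corrected next tick
    return errs

errs = entrain()
print(f"early error {errs[1]:+.3f} s -> final error {errs[-1]:+.3f} s")
```

With both corrections active, the asynchrony decays toward zero over a few beats: the oscillator has "locked on" and can now predict upcoming beats rather than react to them.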
Several lines of cognitive neuroscientific theory converge on the idea that by synchronizing the rhythmic activity of neurons in perceptual networks to the rhythmic activity of repeating external stimuli, organisms are able to improve their perceptual accuracy and predict the future activity of that stimulus [7, 8]. If oscillatory activity in a listener’s auditory networks is synchronized to a repeating musical beat, that listener is able to predict when the next beat will happen. Dancers, musicians and even passive listeners are not reacting to every beat with surprise. Rather, they can predict that the beat is coming, given the internal metronome provided by entrained neural oscillations.
Ongoing entrained oscillations allow the human brain to track a musical rhythmic stream and also provide a temporal framework against which syncopation and other complex rhythms can be judged. When James Brown’s band plays an unexpected phrase, the listener understands this rhythmic surprise as a variation against the internalized, ongoing temporal framework. Interestingly, this phenomenon is proposed to exist in the visual domain as well. When we look at a bistable percept like Rubin’s famous vase/face, where one might see a white vase against a black background or two black faces in profile against a white background, neural oscillations in the visual pathway code for figure vs. ground. Similarly, neural oscillations underlie the perception of a musical background (the pulse of the tempo) against which we perceive a musical figure (a syncopated rhythm or an expressive solo) [5, 6].
Neural Oscillations Track the Beat and Meter
In examining EEG data recorded from the brains of subjects exposed to rhythmic music, oscillations can be identified whose frequency corresponds to the tempo of the music they hear. If we examine the EEG time-series of a person listening passively (EEG is no good when you’re dancing) to that techno track at 120 BPM, we would see ongoing oscillations, stable in phase and period (called steady-state evoked potentials or SS-EPs), occurring at 2 Hz. Additionally, because musical meter is often marked by accentuation on the first note of the bar, we would observe a sub-harmonic of the beat frequency oscillation corresponding to the meter frequency; in the case of our quadruple meter techno song, 0.5 Hz [6].
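The frequency-tagging logic behind these SS-EP findings can be mimicked with a synthetic stimulus rather than real EEG: build a pulse train at 120 BPM, accent the first beat of each four-beat bar, and inspect its spectrum. This is only a sketch of the analysis idea using NumPy, with invented signal parameters:

```python
# A quadruple-meter "techno" pulse at 120 BPM: an impulse every 0.5 s
# (2 Hz beat), with the first beat of every bar accented (0.5 Hz meter).
import numpy as np

fs, dur = 64, 64                                  # sampling rate (Hz), duration (s)
signal = np.zeros(fs * dur)
beat_samples = np.arange(0, fs * dur, fs // 2)    # one beat every 0.5 s -> 2 Hz
signal[beat_samples] = 1.0
signal[beat_samples[::4]] += 1.0                  # accent every 4th beat -> 0.5 Hz

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

for f in (0.5, 2.0, 0.75):                        # meter, beat, and a control frequency
    idx = int(np.argmin(np.abs(freqs - f)))
    print(f"{f} Hz magnitude: {spectrum[idx]:.1f}")
```

The spectrum shows clear peaks at the 2 Hz beat frequency and at its 0.5 Hz meter sub-harmonic, while a control frequency like 0.75 Hz carries essentially no energy, mirroring the frequency-specific peaks reported in SS-EP studies.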
Given the steady beat of techno, it’s easy to imagine cells in your auditory pathway firing every time the bass drum hits. However, in the case of Brown’s “I Got the Feelin’”, where not every beat is sounded and some notes happen in unexpected places, it becomes clear that entrained oscillations are the key to tracking the beat despite complexities and variations in rhythm. The repetition in Brown’s music reinforces a listener’s perception of the beat, but even in the case of improvised, non-repetitious jazz music, listeners’ entrained neural oscillations provide a guide through which the rhythmic acrobatics of the band can be understood.
By allowing a listener to infer the beat, entrained neural oscillations give James Brown the opportunity to create rich rhythmic complexity safe in the knowledge that his fans will not trip over themselves. And, for those who feel inherently un-rhythmic, there is hope. Musical training has been shown to increase the strength of neural entrainment to musical rhythm and to improve musical beat tracking [3]. Few people are born with the rhythmic aptitude of the Godfather of Soul, but practice can improve anyone’s beat keeping. In addition, the hierarchical nature of musical time and the non-linear dynamics of the neural oscillations involved in perceiving that time mean that it’s possible for different listeners to feel the beat at different harmonics of the fundamental pulse. So, leave any shyness behind you and take your brain for a workout on the dance floor. Let your oscillating neurons guide you and you’ll never lose the beat.

Steve's Master's Thesis
Neural entrainment is the process by which ensembles of neurons in sensory networks synchronize their oscillation in phase and period to that of rhythmic, repeating external stimuli. In music perception, evidence suggests that neural oscillations entrain to the frequencies of musical beat and meter. Polyrhythms are complex rhythmic structures in which two non-factorial rhythms co-occur over a common tempo. For example, if you walk down the street, your feet create a duple rhythm (based in two): left-right, left-right. If, while walking, you repeatedly count to three (a triple rhythm) in time with your footsteps, you will be the embodiment of a three-over-two (3:2) polyrhythm.
To investigate the effect of musical experience on the strength of entrainment to musical polyrhythm, and to determine whether stronger entrainment correlates with improved musical performance, Prof. Dr. Gabriel Curio and Dr. Gunnar Waterstraat of the Charité Neurophysics Group, Carola Bothe of the Freie Universität Department of Computer Science, and I conducted an EEG study with a musical performance task. While EEG was recorded, subjects listened to a 3:2 polyrhythm and were asked to complete either the duple or triple rhythm by striking an electronic drum pad when cued for that rhythm. We collected information on subjects’ musical background and compared this with their accuracy on the musical task and with EEG measures of neural entrainment to the polyrhythm stimulus.
We found strong relationships between musical experience and performance on the musical task. Increased musical experience correlated negatively with performance error and positively with performance consistency. Additionally, musical experience correlated positively with entrained oscillatory power at rhythm-related frequencies. A strong positive correlation was found between musical experience and entrainment at the first common harmonic of the duple and triple rhythm frequencies. This suggests that with musical training, the brain can increasingly track the separate components of a polyrhythm in an integrated manner, via oscillations entrained at frequencies that encompass both rhythms.
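To make "first common harmonic" concrete, here is a hypothetical worked example in Python. The 0.5 Hz cycle rate is an invented number for illustration, not the stimulus rate used in the study; the point is that the least common multiple of the two pulse counts gives the first frequency shared by both rhythms:

```python
# Hypothetical 3:2 polyrhythm: one full cycle every 2 seconds (0.5 Hz),
# containing 2 duple events and 3 triple events.
from math import lcm

cycle_hz = 0.5                              # invented cycle rate for illustration
duple_hz = 2 * cycle_hz                     # 2 events per cycle
triple_hz = 3 * cycle_hz                    # 3 events per cycle

# The first frequency that is an integer multiple of BOTH rhythm
# frequencies corresponds to lcm(2, 3) = 6 events per cycle:
common_harmonic_hz = lcm(2, 3) * cycle_hz

print(duple_hz, triple_hz, common_harmonic_hz)  # 1.0 1.5 3.0
```

An oscillation entrained at that common harmonic "sees" both component rhythms at once, which is one way to read the integrated tracking suggested above.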
A significant relationship between entrainment strength and accuracy on the musical task was also present in the data. By employing spatial filtering algorithms adapted to the purpose by Dr. Waterstraat, a strong negative correlation was found between the power of oscillations entrained at the duple rhythm frequency and error on duple rhythm trials of the musical behavioral task.
The results of the study strongly suggest that by entraining oscillations in the auditory pathway to the components of musical rhythm, the brain gains accuracy in musical rhythm performance, and that musical training can increase the strength of this entrainment. Additionally, the work showed the value of interdisciplinary scientific collaboration, given the team’s skill sets which ranged from neurology and electrophysiology to advanced algorithmic coding to experience as a professional musician. Dr. Curio’s interest in leading a team with a diversity of perspectives and training created a very productive and collaborative environment for research, which I’m grateful to have been a part of.

Steve Garofano
Berlin School of Mind & Brain, M.Sc.
Neurophysics Group, Charité-Universitätsmedizin Berlin 

References:
1: Rouse et al., Front Neurosci 2016
2: Brown, J. (1968). I got the feelin’. On I got the feelin’ [LP]. Cincinnati, OH: King Records.
3: Levitin et al., Annu Rev Psychol 2017
4: Grahn, Top Cogn Sci 2012
5: Vuust et al., Neural underpinnings of music: the polyrhythmic brain. In Neurobiology of Interval Timing, 2014
6: Nozaradan, Philos Trans Roy Soc 2014
7: Jones and Boltz, Psychol Rev 1989
8: Lakatos et al., Science 2008
9: Large and Snyder, Ann NY Acad Sci 2009

June 25, 2018

It’s Not Just RoboCop, It’s Your Grandmother With a Pacemaker

Cyborgs are hybrid creatures, unsettling by nature. They have been used in the arts as symbols of both the progress and dangers brought by scientific discoveries. They are objects of both fascination and disgust. They are, therefore, often a great testimony to the preoccupations of the period and give us an idea of how our ancestors imagined the future of humanity, our present.

The word "cyborg" is a contraction of "cybernetic organism". For the purposes of this article, we will define it as an organic being with mechanical body parts: an organism that has restored function or enhanced abilities due to the integration of an artificial component or technology that relies on some sort of feedback. The term was coined in 1960 by Manfred E. Clynes and Nathan S. Kline to refer to an enhanced human who could survive in extraterrestrial environments [1]. But more on space exploration later…
First, we have to go back to the earliest occurrences of cyborgs in the arts, which date to the mid-19th century.

Le Concert à vapeur (The Steam Concert) by Jean-Jacques Grandville, 1844


Rust, Bone and Steam - the Early Cyborgs
The first visual appearance of a cyborg dates from 1844. Le Concert à vapeur (The Steam Concert) presents a band of musicians who have integrated their instruments into their bodies and have steam coming from their heads. It’s a strange, humorous and enticing idea if you have ever had to carry around a heavy cello or a tuba [2].
The master of the fantastic, Edgar Allan Poe, also tackled the concept of the cyborg in a beautifully written short story, The Man That Was Used Up (1839). Obsessed with the physical perfection of the mysterious Brevet Brigadier General John A. B. C. Smith (including his moustache [3]), the protagonist runs around the city to learn his story, only to discover that the handsome officer is more a puzzle than a man. Writing at the time of the colonial wars and the industrial revolution, Poe plays with the imagery of the strong warrior and questions the increasingly important place of mechanisation in everyday life.

A Pacifist Dystopia
In the aftermath of World War II, artists tried to think of solutions to ensure a lasting peace. In Limbo (published in 1952 but set in 1990), Bernard Wolfe imagines a world where humans try to suppress their aggressive impulses by performing voluntary amputations of their arms and legs. Unfortunately, the science of prosthetics progresses too, and the new limbs end up being better for war than the natural ones. The description of this society of limbless men extends to social, sexual and philosophical problems as well. This satire is also a severe critique of totalitarian thinking and acting [4].
In his 1963 novel The Three Stigmata of Palmer Eldritch, Philip K. Dick imagines a bleak future for humanity. In the year 2010, the Earth has become so warm that one must carry an individual cooling system to venture outside during the day. Mars and a few other planets are inhabited by human colonists forced into exile by a draft. Their only escape from life on a desolate planet is to chew a hallucinogenic drug, Can-D, that “translates” them into a parody of Earth. Then Palmer Eldritch, an enigmatic cyborg space explorer believed to be dead, comes back from a far-away system. He brings with him an incredible new drug whose potential to create hallucinated worlds goes far beyond what could be experienced with Can-D. However, the nature of the drug and his motives soon appear to be very sinister. In this rich novel, the character of Palmer Eldritch is presented as a futuristic incarnation of the devil, coming to tempt and judge humans.
“The elevator arrived. The doors slid aside. Inside the elevator waited four men and two women, silently. All of them were Palmer Eldritch. Men and women alike: artificial arm, stainless teeth… the gaunt, hollowed-out grey face with Jensen eyes.” [5]


https://bit.ly/2IFpT3c via pixabay

Welcome to the Era of the Macho-Cyborg
Of course, there are multiple examples of cyborgs in sci-fi television series such as Star Trek and Doctor Who, as well as in the Marvel universe. Too many, in fact, to present here. However, I feel I should mention The Six Million Dollar Man, which aired between 1973 and 1978, as it remains a reference in popular culture. Steve Austin, a former astronaut with bionic implants, is a friendly cyborg who looks completely human most of the time and works for the US government. Unlike earlier examples of cyborgs, there is nothing scary about him.
Austin was soon followed by another righteous cyborg: RoboCop. In Paul Verhoeven’s 1987 movie of the same title, RoboCop is a police officer killed by a gang of criminals and later revived as a powerful cyborg who violently fights crime and corruption. However, my favorite “cyborg civil servant” character has to be Inspector Gadget. I grew up watching the excellent animated series (which originally aired between 1982 and 1986), which has the best theme song ever. I dare you not to immediately google it. Of course, there is also a 1999 movie directed by David Kellogg.
These three characters exemplify the late-1970s and 80s concept of the cyborg as an improved, ultra-masculine man, without the satire of earlier depictions (from E. A. Poe or B. Wolfe). Moreover, the cyborg is no longer seen as a threat, but as the next step in human evolution, helping to solve the problems of our society.

Entering Cyberspace
Then there are the cyborgs whose bodies are entirely artificial. Let’s talk about Ghost in the Shell. Before the recent Hollywood adaptation, Ghost in the Shell was a manga by Masamune Shirow published in 1989 [6]. The protagonist, Motoko Kusanagi, is a woman whose mind (her “ghost”) now lives inside an artificial body (the “shell”). Here, the body and the mind are presented as two different entities. Indeed, Motoko is able to leave her body and enter cyberspace to hack computer systems or find other “ghosts”.
Motoko: “Just as there are many parts needed to make a human a human, there’s a remarkable number of things needed to make an individual what they are. A face to distinguish yourself from others. A voice you aren’t aware of yourself. The hand you see when you awaken. The memories of childhood, the feelings for the future. That’s not all. There’s the expanse of the data net my cyber-brain can access. All of that goes into making me what I am. Giving rise to a consciousness that I call “me”.”
This duality was perhaps best introduced by William Gibson in his iconic novel Neuromancer. In this 1984 (so pre-internet) thriller, Gibson develops the concept of cyberspace, a term he invented. In his world, most people are cyborgs. However, implants are not enough to render life bearable: when the protagonist is unable to enter cyberspace because his nervous system is damaged, he feels trapped.
“For Case, who’d lived for the bodiless exultation of cyberspace, it was the Fall. […] Case fell into the prison of his own flesh.” [7]
I cannot conclude this article without a few honorable mentions to pop culture characters who are cyborgs, even though you might not see them that way: Darth Vader, Edward Scissorhands but NOT the Terminator (despite what he says repeatedly during the movie, he is a robot – not a cyborg).

https://bit.ly/2DLDrH6 via pexels


When Reality meets Science Fiction
To misquote Gray, Mentor and Figueroa-Sarriera, science fiction writers and editors of The Cyborg Handbook [8]: “It’s not just RoboCop, it is (y)our grandmother with a pacemaker.”
Today, cyborgs are living among us. A British man, Neil Harbisson, is the first officially recognized human cyborg. Born colour-blind, Harbisson had an antenna implanted into his occipital bone in 2004. It allows him to hear colours, including infrared and ultraviolet: the antenna detects the wavelengths of the colour in front of him and produces a sound that he hears through bone conduction [9]. Since the antenna now appears on his passport, he was recognized in 2018 as the first human cyborg by the Guinness Book of World Records.
Although the term cyborg now officially refers to a real being as well, reality has yet to catch up with fiction. Artists have been dreaming up cyborgs for over 170 years. In addition to being highly entertaining, science fiction novels and movies inform and alert us to shifts in our society by imagining worlds with different possibilities for the evolution of humanity.

Aliénor Ragot
PhD Student, AG Holtkamp

[1] Clynes and Kline, Astronautics 1960.
[2] Grandville and Taxile Delord, Un Autre monde : transformations, visions, incarnations, ascensions.1844
[3] "You perceive I cannot speak of these latter without enthusiasm; it is not too much to say that they were the handsomest pair of whiskers under the sun", The Man That Was Used Up, p. 361
[4] Anonymous, Psychiatr Q, 1953
[5] The Three Stigmata of Palmer Eldritch, p. 171
[6] Shirow, Ghost in the Shell, 1989 (manga); Oshii 1995,2004,2008 (animated films); Sanders 2017 (live-action film)
[7] Gibson, Neuromancer, 1984
[8] Gray, Mentor & Figueroa-Sarriera, The Cyborg Handbook, 1995 p. 2
[9] https://bit.ly/2L5nyjA

June 22, 2018

“How Did That Get There?” - Foreign Objects in the Brain

The brain is our inner sanctum, containing every experience of consciousness, sensation, and memory. It is cradled in the skull and wrapped in the meninges, as well as aggressively defended by a special wing of the immune system. There is no way that anything should get in (or out). However, because of this very organ, humans are ingenious creatures and can always find ways to do incredibly stupid things. This article is not for the faint of heart.

“But I saw it on the internet….”
As I was researching this article, I quickly learned (to slight disappointment) that many stories entitled ‘Man or Woman had XYZ in Brain’ had pretty dodgy concepts of anatomy. We will start with the obvious: the brain is encased in the skull, and has no immediate access to the outside world. Some routes may be shorter than others, like the thin sheet of bone above the nasal cavity, but without blunt trauma they’re tightly sealed.
For example (apologies in advance to all arachnophobes), spiders can very easily wander into or lay eggs in your ears. However, to make it to the brain, they would have to break through the eardrum, wander all the way through the middle and inner ear, and somehow squeeze their way along the auditory nerve. So those stories you may have heard about ant colonies growing in someone’s brain? Undoubtedly fake.

Misplaced Objects
As one might suspect, a lot of foreign objects find their way into the brain via tragic or unfortunate circumstances. For example, two Turkish neurologists reported a seemingly normal man who was discovered late in life to have three needles in his brain. His only symptom? A few headaches. The authors suspected that the needles were from a failed homicide attempt when the man was an infant. He survived the removal surgery, and presumably went on with his life [1].
As with other types of surgeries, it is also possible for tools, gauze, or other implements to be left behind in the brain. Tissue grows around these foreign bodies, and infection or abscesses may occur if not caught in time (usually by neurological symptoms). Most (but not all!) of the time these remnants can be removed, and one may assume that a malpractice lawsuit soon follows.
One of the more famous examples of foreign objects in the brain came from a man named Dante Autullo in 2012 [2]. He was using a nail gun to fix a shed, and somehow misfired it into his own head. His friend cleaned up what looked like a surface wound, and they continued building. The next day Autullo was feeling slightly nauseous, and a trip to the doctor ended with him getting a nail removed from his brain. He survived without further impairment. Compare this with the very famous case of railroad worker Phineas Gage, who took an iron rod through the head. He, too, survived the injury, but his peers noticed a massive change in personality that persisted until he died twelve years after the accident [4].


Radiology Picture of the Day, Courtesy Dr. Laughlin Dawes. Pictured is a CT scan of a patient who pushed a ballpoint pen through the eye socket (and survived).


Squirm-Inducing
Apart from trauma, there is a more insidious and infinitely more gross way that things can get into your brain: parasites. For example, the pork tapeworm Taenia solium can enter the bloodstream from poorly cooked food and invade the skin, eyes, and brain. The danger here is not acute, but rather builds up over time as the body forms cysts in an attempt to isolate the parasite from brain tissue [3]. Depending on where these cysts develop, they can cause seizures or other neurological complications. In developing countries, a substantial fraction of the general population is estimated to carry the tapeworm, but the proportion burdened with neurological problems is unknown. Treatment of these brain invaders is also complicated, as the most common antiparasitic drugs cause severe tissue inflammation, which in the brain can be fatal. Often, surgical excision of the cysts is the only option [5].

The brain lesion was... moving.

But for the maximum gross-out factor, let’s also spend a moment talking about larger critters. In 2010, a British man visited the doctor complaining of headaches, strange smells, pain and other vague symptoms. He was tested for just about every neurological disease under the sun, including dozens of brain scans, but doctors couldn’t figure out what was wrong. It was only when a neurologist compared the scans over time that they noticed that a small brain lesion was… moving. Yes. He had a living worm in his brain. It turned out to be a rare type of frog parasite, and was genotyped in the hope of better diagnosing infected patients in the future [6]. Cases like his are (thankfully!) extremely rare, but do still pop up from time to time...

No Entry
In short, having anything in your brain besides neurons, glia and supportive cells is bad news. It takes a keen eye and a good neurologist to spot and deal with the problem, and all of us are better off avoiding these situations all together. So, keep pointy objects away from your head and cook your food well.
Oh, and don’t believe everything that you read online.

Constance Holman
PhD Student, AG Schmitz

[1] Pelin and Kaner, Neurol Int 2012
[2] https://bit.ly/2jNy3vL
[3] https://bit.ly/2rxLsvq
[4] https://bit.ly/2L4vhPc
[5] White, J Infect Dis 2009
[6] Bennett et al., Genome Biol 2014



Like what you see? Interested in contributing? We are always looking for new authors and submissions on anything related to (neuro)science. Pitch us an article, or send us beautiful shots from your microscope or your poems, to claudia.willmes@charite.de!

June 18, 2018

Something Fishy Going On - The Impact of Nanoplastics on the Behaviour of Fish

With the enormous rise in plastic use and production within the last century, we are now coming to terms with the impact plastics are having on our environment.

As a result, an expanding area of research studies how microscopic plastic particles less than 100 nm in size, referred to as “nanoplastics” (NPs), could be the most hazardous form of marine litter [1]. Due to their small size, and therefore high surface-area-to-volume ratio, NPs can retain toxic chemicals, which could lead to an accumulation of these toxins in marine organisms once the NPs pass through membranes and into cells [2].

Krill, Source: Wikimedia Commons

NPs are able to cross from the blood into the brain: plastic deposits were found in the brains of the tiny crustacean Daphnia magna after the animals were kept in water enriched with nano- and micro-sized particles [3]. The direct effect of these NPs can be seen in the behaviour of Daphnia: animals that consumed the plastics showed a dose-dependent increase in mortality, and at lower, less lethal concentrations, NPs slowed their feeding and predatory behaviours [4].
Researchers are also investigating whether NPs can be passed up the food chain, since smaller creatures like Daphnia are eaten in large numbers by bigger fish. NPs administered to algae have been found to accumulate in fish and to have a profound effect on their feeding and shoaling behaviour, as well as on their metabolism [5].

Can plastic nanoparticles alter behaviour?

It is important to note that the occurrence of NPs in the natural marine environment has not yet been proven, and the above-mentioned studies used much higher concentrations than would be found in nature [1]. However, these results can be seen as a wake-up call to start thinking about the impact our plastic use has on the environment.

Joanne Falck
PhD Student, AG Garner

[1] Koelmans et al., Marine Anthropogenic Litter, 2015
[2] Velzeboer et al., Environ Sci Technol 2014
[3] Mattsson et al., Scientific Reports 2017
[4] Mattsson et al., Environ Sci Technol 2015
[5] Mattsson et al., Environ Sci Technol 2014



June 15, 2018

Sound or Silence? The Pros and Cons of Cochlear Implants

Cochlear implants (CI) have been in use for several decades, yet there is still an active controversy surrounding these devices. While some people strongly advocate for the positive effect on an individual’s life, others claim that implants are dangerous both to individual health and to deaf culture at large [1].

PRO
Even though a CI cannot provide 100% hearing capability, it enables the individual to hear and understand most sounds. A cochlear implant does not amplify sounds like common external hearing aids, but stimulates the auditory nerve. The implant essentially replaces the function of the hair cells in the inner ear that usually register sound vibrations.  
Most importantly, a CI enables individuals to hear themselves and thus learn to speak and articulate; an implant therefore helps them communicate with hearing people who do not know sign language. It also spares them from lip reading and from generally depending on others for hearing help.
Another great advantage of a CI for young people is that it can help them fully participate in mainstream schools and society, as well as broaden their career choices. While their choices may still be limited, the limitation won’t be as severe or as disabling as if no hearing option were available to them.
Being able to hear is also a measure of safety: the ability to locate sounds allows you to be more aware of perilous situations and hear impending danger such as a car coming from behind.

Blausen.com staff (2014). "Medical gallery of Blausen Medical 2014". WikiJournal of Medicine 1 (2). DOI:10.15347/wjm/2014.010. ISSN 2002-4436. - Own work, CC BY 3.0, https://commons.wikimedia.org/w/index.php?curid=29025007


CONTRA
A CI requires surgical insertion, and surgery always bears risks. Since the device became available, risks have been minimized; however, complications may include occasional facial numbness or minor facial paralysis. Among individuals wearing a CI, there is also a higher incidence of bacterial meningitis than in the general population, so immunization is recommended. The body may also reject the implant, which could require removal or further surgery.
It is important to keep in mind that a CI doesn’t guarantee that a person will be able to hear and speak at a normal level. In some cases, the person with the implant can only hear some environmental sounds. Particularly in adults who receive the implant, electronic signals might not register fully and some hearing impairment may remain. It is also important to acknowledge that if a person has been deaf for a long time, their auditory cortex is no longer used to processing sound in the same way, so a CI wouldn’t help much.
Thus, parents of hearing-disabled children are urged to make a decision for their child as soon as possible. Most people with an implant still need special help learning to speak, and in many cases they will still be stigmatized. Even though they can hear and speak, their hearing capabilities are not the same as a hearing person’s [2]. Luckily, the devices are getting better, and improved sound perception lets CI wearers integrate better into mainstream society.
Obviously, it takes time to get used to the implant, and especially in the beginning many wearers need the CI reprogrammed according to their needs. People with a CI are also limited in some physical activities, especially those involving contact with water, which could damage the implant.

Controversy in Deaf Society
The primary controversy regarding CI concerns the definition of deafness as a disability. Recently, journalist Enno Park gave a talk at the Berlin re:publica conference, where he shared his very personal view of the two sides (hearing and non-hearing) of society [3].
The medical community generally regards deafness as a disability that should be treated, and mainstream (hearing) society is of the opinion that hearing allows for a more fulfilling life. Meanwhile, many individuals who are deaf, as well as others familiar with non-hearing society, feel that deafness is a cultural identity rather than a disability [4]. To them, a CI implies that there is something wrong with them that needs to be fixed, and that living with a hearing impairment is inherently less fulfilling than living as a "normal" person. These members of the community therefore perceive putting something technical in their brains as a serious affront.
Within the non-hearing community, there is a long history of disagreement. Some signing people feel that CI wearers are betraying their culture; some even go so far as to describe the implantation of children as “child abuse”. Also, as more people wear CI, the need for sign-language interpreters decreases, to the disadvantage of those who still rely on their services. On the other hand, parents who do not want their children implanted face hostility not only from the hearing community, but also from CI wearers.




--> Get an idea of how sound is perceived through a CI! <--




Interestingly, attitudes towards sign language differ from country to country. In Germany and France, most people working in deaf education do not even speak sign language, and the emphasis is on children learning to lip read [2,3]. In contrast, in the US every police department is expected to have one or more interpreters available on call [5]. These attitudes and values will likely change over time, both with improving hearing technology and with activism from within the deaf community.
The decision to receive a CI is a very personal one that should be made with the help of a medical professional. Every parent of a hearing disabled child must decide what the best choice is for their situation.

Claudia Willmes
PhD Alumna, AG Eickholt / AG Schmitz

Background:
From 2007 to 2008 I worked at the Institute le Bruckhof in Strasbourg, France, an institution for hearing disabled children, where many children wore cochlear implants [6]. The children received special language training and were encouraged to read lips instead of using sign language. I also took classes at an adult education center for a year to learn DGS (German Sign Language) and attended the university in Strasbourg to learn LSF (French Sign Language). Thus, I heard opinions from all sides: the hearing, the hearing disabled who sign, and those wearing CI.

[1] www.nad.org
[2] personal communication
[3] talk by Enno Park https://bit.ly/2K3MO8P
[4] Ohio University, The Institute for Applied & Professional Ethics https://bit.ly/2wxdrBg
[5] U.S. Department of Justice https://www.ada.gov/q%26a_law.htm
[6] www.bruckhof.org/
 

Like what you see? Interested in contributing? We are always looking for new authors and submissions on anything related to (neuro)science. Pitch us an article, or send us beautiful shots from your microscope or poems, to claudia.willmes@charite.de!


June 14, 2018

Retinal Implants

Science fiction movies and novels often feature cybernetic transformations of the human body as a "must-have" for the future, and the idea of cybernetically engineered eyes seems to be especially popular.
 
Who would not appreciate a contact lens that connects to the internet and displays information in real time, as in the TV series Altered Carbon [1]? Nevertheless, some people already have retinal implants, or "bionic eyes", because they cannot see without them. Individuals with neuronal degeneration of the retina, for example, can receive retinal implants to restore some of their vision. Two diseases cause the majority of retinal degeneration: retinitis pigmentosa (RP) and age-related macular degeneration (ARMD).

Retinal Implants Can Restore Basic Greyscale Vision
RP is caused by mutations in more than 50 different genes and can thus be inherited. In RP, the translation of light into electrical signals by the rod photoreceptor cells is disrupted. In most cases the symptoms develop gradually, starting with night blindness and loss of peripheral vision, and affect both eyes equally. The disease then progresses from the peripheral visual field toward the central area. Fortunately, only a few patients become completely blind [2].
In contrast to RP, ARMD affects the central visual field first. Macular degeneration mostly affects older people and does not lead to complete blindness. Nevertheless, loss of vision in the central visual field greatly affects quality of life [3].
Fortunately, we live in the 21st century, and research on biomedical devices is already quite advanced: patients now have the option of receiving retinal implants. These implants mostly work with glasses carrying a built-in external camera. The camera sends the image information to a mini computer, where it is processed and then sent to the retinal implant via a cable (see figure). The implant electrically stimulates the nerve fibres, much as functioning photoreceptors would. Current implants can restore basic greyscale light perception, so that patients are even able to identify objects [4].

Source: https://bit.ly/2I7HRyk
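As a rough illustration of the processing step described above, one can think of it as downsampling each camera frame to the implant's electrode grid and mapping local brightness to a stimulation amplitude. The sketch below is a toy model under assumed parameters: the 6×10 grid and 100 µA maximum current are illustrative choices, not the specification of any actual device.

```python
import numpy as np

def frame_to_stimulation(frame, grid=(6, 10), max_current_ua=100.0):
    """Downsample a greyscale camera frame to an electrode grid and map
    the mean brightness of each block to a stimulation amplitude.

    frame: 2D array of pixel intensities in [0, 255]
    grid:  (rows, cols) of the electrode array (illustrative assumption)
    """
    h, w = frame.shape
    rows, cols = grid
    # Crop so the frame divides evenly into grid blocks
    frame = frame[: h - h % rows, : w - w % cols]
    blocks = frame.reshape(rows, frame.shape[0] // rows,
                           cols, frame.shape[1] // cols)
    brightness = blocks.mean(axis=(1, 3))       # one value per electrode
    return brightness / 255.0 * max_current_ua  # linear brightness -> current

# Toy frame: a bright square on a dark background
frame = np.zeros((60, 100))
frame[20:40, 40:60] = 255.0
stim = frame_to_stimulation(frame)
print(stim.shape)  # (6, 10): one amplitude per electrode
```

Real devices additionally apply contrast adjustment and safety limits per electrode, but the core idea is the same reduction of a high-resolution image to a coarse stimulation pattern.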

The Bionic Eye
Argus II, also known as the "Bionic Eye", is the oldest model with an external camera and has been approved for clinical application in Europe and the US. The device lets patients see patterns of light, which they learn to interpret over time [5].
The Boston Retinal Implant is similar to the Argus II but has more electrodes in its retinal array, giving it a higher resolution. This model is still being tested in animals and has not yet reached clinical studies [6].
The Intelligent Medical Implant (IMI) is also comparable to the Argus II. Its advantage over the Argus II is that it can be individually calibrated upon implantation, allowing visual perception to be adjusted for each patient. The IMI is not yet available for clinical application but is in clinical trials in Europe [7].
One disadvantage of all three models is the transscleral cables that connect the outside electronics with the retinal implant, which raise the risk of infection during long-term implantation. A different model, the Epi-Ret 3, avoids this problem: it also combines an external camera with a retinal implant, but there is no external case, and the information is sent wirelessly to the implant, where it is translated into electrical stimulation. Without transscleral cables, infections and other postoperative complications are less likely. The Epi-Ret 3 is currently in clinical trials in Europe [8].

Getting Closer to Natural Vision
A very different model from the ones mentioned so far is the Alpha-IMS. It works without an external camera; instead, its retinal implant consists of a 1500-pixel multi-photodiode array, thereby mimicking natural vision very closely. The light-sensitive photodiodes respond to light and send signals to a subdermal power control, which can be charged wirelessly by a handheld device that also adjusts the light sensitivity [9]. The model is currently in clinical trials in Europe and Hong Kong. Unfortunately, there have been cases of implant corrosion and subretinal bleeding [10, 11].
In 2014, a review analysed these devices and compared visual performance in patients [4], showing that the Alpha-IMS and Argus II deliver the best results. The best visual acuity recorded to date, 20/1260, was achieved with the Argus II [12]. Compared to perfect 20/20 vision, these patients remain nearly blind, but they can see significantly more than without the implants.
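To put Snellen fractions like 20/1260 on a common scale, they can be converted to decimal acuity or to logMAR (the logarithm of the minimum angle of resolution, where higher values mean worse vision). The snippet below is a simple sketch of this standard conversion.

```python
import math

def snellen_to_decimal(numerator, denominator):
    """Decimal acuity from a Snellen fraction (e.g. 20/20 -> 1.0)."""
    return numerator / denominator

def snellen_to_logmar(numerator, denominator):
    """logMAR = log10(1 / decimal acuity); higher means worse vision."""
    return math.log10(denominator / numerator)

print(snellen_to_decimal(20, 20))              # 1.0 (normal vision)
print(round(snellen_to_decimal(20, 1260), 4))  # 0.0159
print(round(snellen_to_logmar(20, 1260), 2))   # 1.8
```

In other words, the best implant-assisted acuity corresponds to roughly 1.6% of normal decimal acuity, which makes the gap to 20/20 vision concrete.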
The Alpha-IMS may offer better resolution in a smaller, more focused central field, while the Argus II has a larger field of view. The Boston Retinal Implant has an array with more electrodes than the Argus II, which in theory should give a higher resolution; however, there are still no data to back this up. In addition, these larger devices might generate more heat.
In conclusion, there are good options for people with retinal degeneration. Although the retinal implants presented here still have limited resolution and biocompatibility, future research will help improve these features. For the privileged people with intact vision, however, retinal implants with a wifi connection (still) remain science fiction.

Larissa Kraus
PhD Student, AG Holtkamp


[1] Altered Carbon, Netflix, 2018
[2] https://bit.ly/2r9gZ5C
[3] https://bit.ly/1dBK8yC   
[4] Chuang et al., Br. J. Ophthalmol, 2014

[5] https://bit.ly/1PwBZGY
[6] Rizzo, J. Neuro-Ophthalmology, 2011
[7] Keserü et al., Acta Ophthalmol, 2012
[8] Klauke et al., Investig. Opthalmology Vis, 2011
[9] Kusnyerik et al., Investig. Opthalmology Vis 2012
[10] Stingl et al., Proc. R. Soc. B Biol. Sci., 2013
[11] Zrenner, Sci. Transl. Med., 2013
[12] Humayun  et al., Ophthalmology, 2012


June 11, 2018

Futuristic Brain Implants: Review of the Series "Black Mirror"

Black Mirror is a British sci-fi series created by Charlie Brooker which explores diverse topics, from our obsession with social media (Nosedive, Series 3, Episode 1) to things we can hardly imagine, like placing a person's consciousness in a physical object (Black Museum, Series 4, Episode 6)! The episodes often delve into neuroscience in the not-so-distant future, in particular the ways in which different kinds of technology change how we think and behave.
 
Spoilers ahead!


The episode The Entire History of You (Series 1, Episode 3) is set in an alternate reality where people can have a 'grain' implanted to record every moment of their lives and replay memories on a screen whenever they want. Replaying memories, however, becomes a nightmare, for example for a man who finds out about his wife's affair! Perhaps forgetting is a blessing in disguise! I found the episode Crocodile (Series 4, Episode 3) thrilling, and I am sure it will keep you on the edge of your seat! The main character witnesses a road accident and is later interviewed as a witness by the insurance company. During the interview, the investigator uses a memory-recaller device to look at her recent memories. Since the main character had previously committed a crime, she fears it will be revealed when the recaller scans her memories, so she starts a string of murders to cover it up. The best part is the climax, with the police turning to a pet guinea pig as a witness!

 Welcome to neuroscience in the not-so-distant future

The series has got me thinking about many aspects of neuroscience, and there are still many unanswered questions. Could a guinea pig or other animals have better memory recall or intelligence than we humans do? Could a memory-recall device be feasible in the near future? (It would be great to have, especially for quoting papers during lab meetings that you have read and forgotten!) Of course, this would require that we first understand how memory is encoded and replayed. Transferring consciousness to a physical object also poses the big question: where is consciousness located in the first place? The reality is that we are far from understanding these problems, which is why we have an exciting time ahead of us learning more about the brain and inventing futuristic devices. For those who have not seen Black Mirror, I definitely recommend it.

Aarti Swaminathan
PhD Student, AG Schmitz





This article originally appeared June 2018 in CNS Volume 11, Issue 2, Brain Invasion