
CNI Opportunities

Postdoctoral Fellow, Otorhinolaryngology/Physics/CNI

Applications are invited for an NIH-funded postdoctoral position at the University of Pennsylvania in the Departments of Otorhinolaryngology (Hearing Sciences Center; Yale Cohen) and Physics (Vijay Balasubramanian) and the Computational Neuroscience Initiative (Vijay Balasubramanian and Yale Cohen). This project focuses on identifying the neural and computational basis of auditory-object perception in Old World monkeys. The work combines auditory psychophysics, large-scale neuronal recordings, and computational theory. Candidates are expected to have a PhD (or equivalent) in neuroscience, engineering, physics, psychology, or a similar discipline. Candidates with backgrounds in computational theory are especially encouraged to apply. Preferred candidates will have experience training Old World monkeys on behavioral tasks, recording neuronal activity while monkeys are engaged in behavior, and analyzing the resulting data with computational approaches. The intellectual environment of the University of Pennsylvania is outstanding for computational auditory neuroscience. The Hearing Sciences Center, directed by Cohen, is a multi-investigator group that identifies neuronal correlates of hearing and communication at multiple scales of interrogation, using state-of-the-art computational and causal techniques. The Computational Neuroscience Initiative, directed by Balasubramanian, gathers together the Penn faculty with fundamental interests in systems and computational approaches to neuroscience; the CNI is the central home for research in theoretical and computational neuroscience at Penn. Candidates will have access to, and mentorship from, faculty in both the Hearing Sciences Center and the Computational Neuroscience Initiative. This will be a full-time, 12-month renewable appointment. Salary will be commensurate with experience and consistent with NIH NRSA stipends.
To apply, send your CV along with contact information for two referees to: Applications will be considered on a rolling basis, and we anticipate a Fall 2019 start date.


Postdoctoral positions in Systems/Computational Neuroscience at the University of Pennsylvania

The Geffen Laboratory of Auditory Coding at the University of Pennsylvania has openings for experimental and computational neuroscience postdoctoral fellows. We are looking for energetic, talented, friendly, enthusiastic, and creative researchers who have recently completed, or are about to complete, their PhD.

Qualifications: experience in systems, computational, or molecular neuroscience methods; excellent quantitative skills; and proficiency in, or willingness to learn, scientific programming languages such as MATLAB or Python.

Our laboratory seeks to deepen our understanding of the neuronal circuits and population dynamics that contribute to auditory perception and learning. We use state-of-the-art methods, including two-photon calcium imaging in behaving, head-fixed rodents; large-scale chronic and acute electrophysiological recordings; and optogenetic, pharmacological, and behavioral techniques, in combination with computational modeling and theoretical work. Projects underway in the laboratory include: investigating intracortical and subcortical circuits supporting auditory adaptation to statistical regularities of sounds; identifying neuronal circuits for hearing in the presence of noise and in complex acoustic environments; and unraveling neuronal circuits controlling learning-driven changes in auditory perception.

The Geffen Laboratory is a creative, productive, and collaborative group, and postdoctoral fellows will have many opportunities to interact and collaborate with members of our laboratory and of other laboratories across the Penn campus and beyond. We are part of the Hearing Sciences Center, the Computational Neuroscience Initiative, and the Mahoney Institute for Neurosciences.

To apply, please send a CV, a cover letter briefly summarizing your research interests, and PDFs of recent publications to: