Conor Houghton

Contact

conor.houghton@bristol.ac.uk

School of Engineering Mathematics and Technology
University of Bristol
Michael Ventris Building
Woodland Road
Bristol
BS8 1UB
England

Official home page.


Papers


The latest news!

(2024-11-27) More paper news; it's been a good week:

GlyT2-positive interneurons regulate timing and variability of information transfer in a cerebellar-behavioural loop.
Ensor Palacios, Conor J Houghton and Paul Chadderton
biorxiv.org

has been accepted by the Journal of Neuroscience.

Our paper reports on complicated experiments by Ensor in Paul's lab to understand the behavioural role of the large inhibitory cells in the cerebellar cortex, more specifically, the way these cells control the representation of whisker movement in the cerebellum. Using fancy methods, the activity of the inhibitory cells was reduced and the neural activity in the cerebellum was recorded.

The results were subtle. There is a lesson here; the large inhibitory cells, the Golgi and Lugaro cells, are clearly metabolically costly, but their role is not as obvious as that cost would lead you to expect. The brain clearly relies on what we imagine to be very subtle modulation of activity and is willing to pay for it in energy!

In any case, using computational methods as fancy as the experimental methods, it was possible to tease this apart and show that down-regulation of Golgi and Lugaro cells has very little influence on sensorimotor representations in the cerebellum but instead modulates the timing of cortical population activity. In short, the inhibitory cells control when information is represented, not what information is represented. Our results indicate that inhibitory interneurons in the granule cell layer are necessary to appropriately pace changes in cerebellar activity to match ongoing behaviour.


(2024-11-29) Postdoc position

There is a postdoc position in Bristol on an animal behaviour project led by the great Emma Robinson. I'm a co-I and we're looking for a computational person keen to do experiments or an experimental person keen to learn some modelling; the deadline is 2024-12-17.

bristol.ac.uk/jobs


(2024-11-20) Paper news:

We have just put a revised version of

An iterated learning model of language change that mixes supervised and unsupervised learning
Jack Bunyan, Seth Bullock, Conor Houghton

up on arXiv:

arxiv: 2405.20818

The good news is that the paper is now in press with PLoS Complex Systems, which I am super excited about. This is the paper I think of as "the big ILM paper" and it is great we got it into a good journal; I'm really keen for people to read it and the venue helps with that!

The ILM, or iterated learning model, was used by Simon Kirby and his co-workers to model the way languages evolve; the "languages" in the model are comically simple, maps from vectors of zeros and ones representing "meanings" to vectors of zeros and ones representing "signals". Simple as they are, we can nonetheless ask how expressive and compositional one of these languages is.
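To make this concrete, here is a minimal sketch of such a toy language, assuming (my choice for illustration, not the paper's setup) three-bit meaning and signal vectors; the `expressivity` measure here simply counts the fraction of meanings that receive a distinct signal:

```python
import itertools
import random

def random_language(bits=3, seed=0):
    """Map every binary meaning vector to a random binary signal vector."""
    rng = random.Random(seed)
    meanings = list(itertools.product([0, 1], repeat=bits))
    return {m: tuple(rng.randint(0, 1) for _ in range(bits)) for m in meanings}

def expressivity(language):
    """Fraction of meanings that get a signal no other meaning shares."""
    return len(set(language.values())) / len(language)

lang = random_language()
print(f"expressivity: {expressivity(lang):.2f}")
```

A fully expressive language scores 1.0; a random one usually scores less because different meanings collide on the same signal.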

In the ILM a tutor teaches a pupil a language using a limited random set of examples; the pupil then becomes a tutor itself and teaches a new pupil. Since it only learned a subset of the language, it is forced to generalize to produce its own examples. The interesting thing is that the language evolves expressivity and compositionality through the resulting cycles of generalization and subselection.
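The cycle can be sketched roughly as follows; the nearest-neighbour generalization rule here is my own illustration of "filling the gaps", not the learning rule used in the paper:

```python
import itertools
import random

BITS = 3
MEANINGS = list(itertools.product([0, 1], repeat=BITS))

def hamming(a, b):
    """Number of positions at which two binary vectors differ."""
    return sum(x != y for x, y in zip(a, b))

def learn(tutor_language, n_examples, rng):
    """The pupil sees a random subset of the tutor's language and
    generalizes: unseen meanings get the signal of the nearest seen meaning."""
    shown = rng.sample(MEANINGS, n_examples)
    pupil = {m: tutor_language[m] for m in shown}
    for m in MEANINGS:
        if m not in pupil:
            nearest = min(shown, key=lambda s: hamming(s, m))
            pupil[m] = tutor_language[nearest]
    return pupil

rng = random.Random(1)
language = {m: tuple(rng.randint(0, 1) for _ in range(BITS)) for m in MEANINGS}
for generation in range(10):
    language = learn(language, n_examples=5, rng=rng)  # pupil becomes tutor
```

Each pass through `learn` is one generation: teaching on a subset, then generalization, then the pupil's language becomes the next tutor's language.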

The original ILM included an unrealistic step: basically, to make a map that goes both ways, mapping meanings to signals and signals to meanings, the pupil needed to do a sort of inversion. It is as if a child stayed silent until 12, then imagined everything they could ever possibly say and did some complicated matrix operation in their head before starting to speak. We found a way of removing this step, but to do that, we needed the pupil to cogitate on meanings, to sharpen its understanding of the language it is learning by mapping meanings into signals, then mapping the signals back to meanings to check that the output meaning was the same as the input meaning.
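A rough sketch of that round-trip check, with a hypothetical encoder/decoder pair standing in for the pupil's two maps (the names and construction are illustrative, not the paper's implementation):

```python
import itertools
import random

BITS = 3
MEANINGS = list(itertools.product([0, 1], repeat=BITS))

rng = random.Random(2)
# Encoder: meanings -> signals; decoder: signals -> meanings.
encoder = {m: tuple(rng.randint(0, 1) for _ in range(BITS)) for m in MEANINGS}
# Collisions in the encoder are resolved arbitrarily here, which is
# exactly the kind of inconsistency the round-trip check exposes.
decoder = {s: m for m, s in encoder.items()}

def round_trip_consistent(meaning):
    """True if encoding then decoding returns the original meaning."""
    return decoder[encoder[meaning]] == meaning

consistency = sum(round_trip_consistent(m) for m in MEANINGS) / len(MEANINGS)
print(f"round-trip consistency: {consistency:.2f}")
```

In this cogitation step the pupil needs no external input: it tests its own language against itself, meaning to signal and back again.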

My new belief is that this is the crucial step for the appearance of language. The hypothesis is that at some point in our evolutionary past we internalized our utterances and from that unfolded language.

Anyway, I have a short talk about it here:

youtube


Old news