Dual Channel Constraints and Natural Type Inference
02/11/2021, 5pm
Speaker
Earl Barr, University College London
Abstract
Code mixes natural language, in identifier names, comments, and stylistic choices (ordering and typesetting), with a formal language that defines a computation. The snippets in each language form a communication channel. Developers read both channels; a CPU processes only the formal channel. The two channels interact and constrain each other. The theory of dual channel constraints elucidates these interactions and points to ways of exploiting them. One prominent application is probabilistic type inference. In an optionally typed language, developers can add type annotations to find local type errors and to provide signposts that help their development environment with navigation, refactoring, completion, and documentation. Natural type inference (NTI) reformulates probabilistic type inference as an optimisation problem: it combines hard logical constraints from the formal channel with soft natural constraints from the natural channel to soundly infer types. Most work, including NTI, assumes the two channels are in sync. I will close by outlining work that tackles the problems that arise when they are not.
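To make the two channels concrete, the short TypeScript sketch below is an illustration of the idea rather than an example from the talk; the names greet, userName, and retryCount are hypothetical. The identifier names are soft, natural-channel hints about the intended types, while the way the values are used imposes hard, formal-channel constraints that any sound inference must respect.

// Natural channel: the names userName and retryCount suggest string and number.
// Formal channel: the usage below imposes hard constraints on the possible types.
function greet(userName, retryCount) {
  // Hard constraint: subtraction only type-checks for numeric operands,
  // so retryCount must be a number.
  const remaining = retryCount - 1;
  // Weaker formal evidence: a template literal accepts values of many types,
  // so the name userName supplies the soft evidence for string.
  return `Hello ${userName}, ${remaining} retries left`;
}
// An NTI-style tool combines both kinds of evidence to propose annotations such as
// greet(userName: string, retryCount: number): string.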
Speaker Bio
Earl Barr is a professor of software engineering at University College London. He received his PhD from UC Davis in 2009. Earl's research interests include machine learning for software engineering (and vice versa), debugging, testing and analysis, game theory, and computer security. His recent work focuses on probabilistically quantifying program equivalence, probabilistic type inference, and dual channel constraints. With the exception of a current, pandemic-imposed hiatus, Earl dodges vans and taxis on his bike commute in London.