/* Compute what is computable and make computable what is not so */

Jared Sylvester

About Me

I'm a consultant/machine learning researcher/data scientist at Booz Allen Hamilton. In 2014 I got my doctorate in Computer Science from the University of Maryland, where I focused on biologically-inspired AI. My dissertation was on executive function and cognitive control (working memory, decision making, etc.) using neural-inspired systems rather than rule-based ones — essentially trying to get computers to act in ways that are a little more like our brains, and to get artificial neural networks to act a little more like traditional computers.

I also did a lot of work at the Center for Complexity in Business, applying data-driven computational techniques to marketing, finance, and other business domains. Beyond that, I've worked in fields including data mining, biometrics, circuit design, cognitive psychology, finance, social networks, and marketing.

Donald Knuth said “Science is what we understand well enough to explain to a computer. Art is everything else we do.”* This is a moving boundary, and I'm interested in the application of algorithmic techniques to liminal fields on both sides of the frontier.

My non-scientific interests include algorithmic animation (I've posted some of my work here), calligraphy (ditto), and baking bread (sadly it's a little tough to put my output from this hobby online). I've also been trying to teach myself some woodworking and archery.

I live in Maryland with my wife, two toddlers, and a Westie.

C.V. / Résumé

(Last updated December, 2020.)

Please note that my résumé requires pre-publication review before it can be revised. This is an extremely bureaucratic and drawn-out process, so while the list of publications here has been updated, my description of my current job role is about two years out of date.

Research

(Everything I have written below refers to work I did in grad school. I should really get around to writing a synopsis of the work I've done since.)

Dissertation

For my dissertation I worked with Jim Reggia on exploring neural models of cognitive control. Most cognitive control models are built using symbolic, rule-based paradigms. Such systems are biologically implausible and often tend toward the homuncular. The neural models that do exist are typically designed narrowly for a particular task, require a great deal of human intervention to tailor them to the objective, and have trouble scaling to larger problem spaces.

I am exploring a more generalizable, neural model of cognitive control: networks that learn not only memories of environmental stimuli but also the steps necessary to complete a task. The steps are stored in a memory formed by a sequential attractor network I developed, so that they can be recalled in order. I call my model GALIS, for "Gated Attractor Learning Instruction Sequences."

By generating behavior from the learned contents of a memory rather than from the explicit structure of the network itself, I believe it will be much easier to change the model's behavior. Rather than having to rebuild the ‘hardware’ of the network, you can instead load different ‘software’ by training the memory on different patterns. Furthermore, making the model's behavior readily mutable opens the door to improving its performance as it gains experience. That, in turn, should allow the model to learn the behavior necessary to complete a task on its own.

Basing behavior on memory contents rather than architecture is not unlike the shift from clockwork automata like Vaucanson's ‘Digesting Duck’ to the Jacquard loom. The latter was an important step in the history of computation because its behavior could be changed simply by swapping in a different set of punchcards — i.e., by changing the contents of its memory. Of course, GALIS surpasses the Jacquard loom because the loom could only follow instructions, not conduct any computation of its own. GALIS, on the other hand, determines endogenously when and how to modify its working memory, produce outputs, and so on.
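To make the sequential-memory idea concrete, here is a minimal, hypothetical sketch — not the actual GALIS code — of sequence recall in an attractor network: a short "instruction sequence" of bipolar patterns is stored with asymmetric Hebbian weights, so each synchronous update pushes the network state from one stored pattern toward the next.

    import numpy as np

    # Illustrative toy only: store a sequence of bipolar patterns with asymmetric
    # Hebbian weights so that each synchronous update steps the network state
    # from pattern t toward pattern t+1.
    rng = np.random.default_rng(0)
    n_units, n_steps = 200, 5
    patterns = rng.choice([-1.0, 1.0], size=(n_steps, n_units))   # the "instructions"

    # Asymmetric "sequence" weights: outer products of successive patterns.
    W_seq = sum(np.outer(patterns[t + 1], patterns[t]) for t in range(n_steps - 1)) / n_units

    state = patterns[0].copy()
    for step in range(n_steps - 1):
        state = np.sign(W_seq @ state)                            # one synchronous update
        overlap = (state @ patterns[step + 1]) / n_units
        print(f"step {step + 1}: overlap with next stored pattern = {overlap:+.2f}")

A fuller model would also need symmetric weights to stabilize each stored pattern and some gating mechanism to decide when to advance from one step to the next; this toy omits both.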

Business & Social Network Analysis

In addition to my dissertation research I'm also working with Bill Rand in UMD's Smith School of Business's Center for Complexity in Business. I'm working on a couple of projects, but the main one for me is an effort to model social interactions in an MMORPG with a freemium business model. Our goal is to predict who will convert from a free to a paid user based on their location in the in-game social graph and on their own characteristics and those of their friends. We're using a variety of techniques, including agent-based modeling, logistic regression, and assorted machine learning methods.
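As a rough illustration of that modeling setup (the file and feature names below are hypothetical stand-ins, not the study's actual variables), a logistic regression over per-player social-graph and activity features might look like this:

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    # Assumed per-player feature table; column names are illustrative placeholders.
    players = pd.read_csv("players.csv")
    X = players[["degree", "n_paying_friends", "sessions_per_week", "account_age_days"]]
    y = players["converted"]                        # 1 if the player ever paid

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("held-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))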

Other Neural Network Research

Prior to GALIS I worked with Jim on two other projects. The first was a computational model of working memory formation, done in conjunction with a wide-ranging study at UMD's Center for Advanced Study of Language (CASL) into the role of working memory in language tasks. That study of working memory led into the cognitive control research I am doing now. I have also used machine learning methods to analyze the results of some CASL studies, to see whether it is possible to determine who will benefit from working memory training based on pre-test results. Please see the 2011 tech report below for more.

The second project, begun in Spring 2007, deals with symmetries in topographic Self-Organizing Maps. By limiting the radius of competition and choosing multiple winners for standard Hebbian learning, we can generate model cortices with global patterns of symmetric maps. Please see the 2009 Neural Computation paper below for details.
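A rough sketch of that mechanism (an illustrative toy, not the published model): a one-dimensional map in which every unit that wins its local competition within a limited radius gets a Hebbian-style update toward the input, so several winners respond to each stimulus.

    import numpy as np

    # Toy self-organizing map with a restricted competition radius and multiple
    # winners per input, rather than a single global winner.
    rng = np.random.default_rng(1)
    n_units, dim, radius, lr = 50, 2, 5, 0.1
    weights = rng.random((n_units, dim))

    def local_winners(activations, radius):
        """Indices of units whose activation is maximal within +/- radius neighbors."""
        winners = []
        for i, a in enumerate(activations):
            lo, hi = max(0, i - radius), min(len(activations), i + radius + 1)
            if a >= activations[lo:hi].max():
                winners.append(i)
        return winners

    for _ in range(2000):
        x = rng.random(dim)                              # random 2-D stimulus
        act = -np.linalg.norm(weights - x, axis=1)       # closer weight -> higher activation
        for w in local_winners(act, radius):
            weights[w] += lr * (x - weights[w])          # Hebbian-style move toward the input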

Undergrad

At Notre Dame I did machine learning research. I worked on creating and testing a system called EVEN, for ‘Evolutionary Ensembles’: a genetic algorithm framework for combining multiple classifiers for machine learning and data mining. It is very flexible, able to combine any type of base classifier using different fitness metrics. This work was done with Nitesh Chawla, who advised me for my final two years at Notre Dame.
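To give the flavor of the approach as a toy sketch (this is not the EVEN code itself, and the base classifiers and GA settings are arbitrary): a genetic algorithm evolves binary masks over a pool of trained classifiers, using the validation accuracy of the masked majority vote as the fitness metric.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.naive_bayes import GaussianNB

    # Toy data and an arbitrary pool of base classifiers.
    X, y = make_classification(n_samples=600, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
    bases = [DecisionTreeClassifier(max_depth=d, random_state=0).fit(X_tr, y_tr) for d in (1, 3, 5)]
    bases += [LogisticRegression(max_iter=1000).fit(X_tr, y_tr), GaussianNB().fit(X_tr, y_tr)]
    preds = np.array([clf.predict(X_val) for clf in bases])       # one row of votes per classifier

    def fitness(mask):
        """Validation accuracy of the majority vote over the selected classifiers."""
        if mask.sum() == 0:
            return 0.0
        vote = (preds[mask.astype(bool)].mean(axis=0) >= 0.5).astype(int)
        return (vote == y_val).mean()

    rng = np.random.default_rng(0)
    pop = rng.integers(0, 2, size=(20, len(bases)))               # population of binary masks
    for _ in range(30):
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[-10:]]                   # keep the fittest half
        parents_a = parents[rng.integers(0, 10, 20)]
        parents_b = parents[rng.integers(0, 10, 20)]
        cross = rng.random((20, len(bases))) < 0.5                # uniform crossover
        children = np.where(cross, parents_a, parents_b)
        flip = rng.random(children.shape) < 0.1                   # bit-flip mutation
        pop = np.where(flip, 1 - children, children)
    best = pop[np.argmax([fitness(m) for m in pop])]
    print("selected classifiers:", best, "validation accuracy:", fitness(best))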

Publications

Journals

Conferences

Reports, working papers, etc.

Other Talks