Mark Musil’s Previous Recommendations
Dan Hammerstrom:
“I was Mark’s advisor for both his Undergraduate Honors Thesis and for PSU’s Undergraduate Research and Mentoring Program (URMP). He also regularly attended my graduate research seminar. I have not had him as a student in any class, since I normally taught graduate courses.
By way of introduction, I am a Professor Emeritus of Electrical and Computer Engineering at Portland State University. I received a PhD in Electrical Engineering from the University of Illinois at Urbana-Champaign and worked at Intel in Oregon for a number of years. In 1988 I founded Adaptive Solutions, Inc., which specialized in high-performance silicon technology (the CNAPS chip set) for image processing, neural network emulation, and pattern recognition. I joined the Oregon Graduate Institute in 1998 and moved to the ECE Department at Portland State in 2005. From 2012 to 2016 I was a Program Manager in the Microsystems Technology Office (MTO) at DARPA. I am a Life Fellow of the Institute of Electrical and Electronics Engineers (IEEE).
It has been a great pleasure to work with Mark. He is bright and well organized. He was diligent in the work he produced and was always prepared for our meetings. He did quality work for both his Honors Thesis (title: “Combining Algorithms for more General AI”) and the URMP (title: “A Dendritic Transfer Function in a Novel Fully-Connected Layer”).
My group is studying the problem of understanding “structure” in images, which generally involves finding objects in the images by first finding the features of an object and then finding the spatial and structural relationships among those features. Extracting, representing, and performing inference over such spatial and temporal relationships is a complex and compute-intensive task. We are using biologically inspired algorithms in this work. These algorithms promise faster learning, requiring only a handful of training vectors, and allow ongoing, continuous learning. Mark worked in two areas that were relevant for us.

For his URMP work he modeled and created a simulation of complex neurons that included extensive dendrite functionality. He was able to read the literature, conceive of ways to build this functionality into the network, and generate a loss function for gradient learning. In the end there were a couple of problems, and he ran out of time to resolve them as fully as we would have liked. Nevertheless, what he did accomplish was important and useful.

For his Honors Thesis he combined a general gradient-descent deep network CNN (Convolutional Neural Network) model with a cortical model developed by Numenta, called HTM (Hierarchical Temporal Memory). The Numenta simulation of the model is available for download. He used the Spatial Pooler layer of the HTM system. Both the HTM and CNN networks were trained on a series of “blocks world” images that my graduate students are using (which we generate automatically).”