In September, I will be starting a law degree at UBC’s Allard School of Law. My interests in this area are broad, and I expect they may change, but at the outset, I want to focus on the parts of law that give the weakest of us a voice, in the justice system and in society in general. This includes defendant rights, free expression, and in some respects, copyright. I also want to contribute to public legal education and access-to-justice initiatives, to give more of us confidence in the outcomes of the justice system and other government decision-making processes.
I am trained as a computer scientist; in that field, I’ve focused on visual object recognition, machine learning, and applications of deep neural networks.
Currently: data scientist at Kobo, volunteer with the BCCLA, amateur sprinter.
Formerly: Head of R&D for Shelfie, Class 1 Flight Instructor (still current but not active), Canadian Forces Officer (Cadet Instructor Cadre), 2× Google Intern.
Sancho McCann. “Object classification and localization using spatially localized features”. Ph.D. Dissertation. UBC Department of Computer Science. 2014. [pdf]
Sancho McCann and David G. Lowe. “Spatially Local Coding for Object Recognition.” ACCV, 2012. [pdf] [poster] [project page]
Sancho McCann and David G. Lowe. “Local Naive Bayes Nearest Neighbor for Image Classification.” CVPR, 2012. [pdf] [project page]
A more complete list is at my Google Scholar profile.
AtmosView: Visualization Redesign
I created AtmosView, a new visualization of atmospheric sounding data (vertical profiles of the atmosphere’s temperature and humidity). People use this data to predict soaring conditions, atmospheric stability, and the likelihood of severe weather. Previous diagrams have been called the most difficult atmospheric diagrams to read. AtmosView helps people see the information they’re interested in more clearly and allows for easier comparisons between multiple charts.
I worked with Dr. Jacky Baltes to build a small-size humanoid robot. I coded in C and cross-compiled for the ARM processor on a Sony Clié. I programmed it to walk and to find and kick a ball. This was our entry in the 2005 FIRA RoboWorld Cup.
At the University of Manitoba, I was part of a team that built a robot airplane that could take off, fly a search pattern, and land—all autonomously. The airplane sent a video feed and telemetry to a ground station, where one of our teammates could mark targets of interest and report their coordinates. We placed first out of seventeen teams in a competition that included BYU, University of Texas, Cornell, MIT, and UCSD. I wrote much of the computer vision code, which transformed the video feed’s pixel coordinates into GPS coordinates and presented that information to our ground station’s operator.
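The core of that pixel-to-GPS transform can be sketched roughly as follows. This is an illustrative reconstruction, not the original code: it assumes a nadir-pointing pinhole camera over flat ground and a known aircraft heading (the real system also had to account for full aircraft attitude), and all names and parameters here are hypothetical.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def pixel_to_gps(px, py, img_w, img_h, focal_px,
                 lat_deg, lon_deg, alt_m, heading_deg):
    """Project an image pixel onto flat ground and return its (lat, lon).

    Assumes a nadir-pointing camera at altitude alt_m above the ground,
    with focal length focal_px expressed in pixels.
    """
    # Pixel offset from the image centre.
    dx = px - img_w / 2.0
    dy = py - img_h / 2.0
    # Similar triangles: at altitude alt_m, one pixel spans alt_m / focal_px
    # metres on the ground.
    right_m = dx * alt_m / focal_px   # camera-frame "right"
    fwd_m = -dy * alt_m / focal_px    # camera-frame "forward" (image up)
    # Rotate the camera-frame offsets by the aircraft heading into
    # north/east ground offsets.
    h = math.radians(heading_deg)
    north_m = fwd_m * math.cos(h) - right_m * math.sin(h)
    east_m = fwd_m * math.sin(h) + right_m * math.cos(h)
    # Convert metre offsets to degrees of latitude/longitude.
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(
        east_m / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon
```

Under these assumptions, the image centre maps to the aircraft's own position, and pixels away from the centre map to proportional ground offsets rotated by the heading.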