Grad Student David Baylies Performs at Electrobrass Conference

At the beginning of November, Music Technology Master’s student David Baylies performed at the NYC Electrobrass Conference, showcasing “Stella”, an electronic trumpet David began developing in the summer of 2016 to let trumpet players use synthesized sounds within DAWs without having to change their trumpet technique. The Electrobrass Conference focuses on the advancement of American music through the combination of brass instruments and live electronics. Throughout the weekend, conference attendees had access to amazing clinics, seminars, and concerts given by some of today’s greatest musical minds!
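How “Stella” works internally isn’t detailed here, but one common way to let an acoustic player drive software synths in a DAW is to track the instrument’s pitch and translate it into MIDI notes. The sketch below shows only that generic frequency-to-MIDI conversion; the function name and example values are illustrative assumptions, not Stella’s actual design.

```python
import math

def freq_to_midi(frequency_hz):
    """Map a detected pitch (Hz) to the nearest MIDI note number.

    Uses the standard equal-temperament relation with A4 = 440 Hz = MIDI 69.
    This is a generic illustration, not Stella's implementation.
    """
    return round(69 + 12 * math.log2(frequency_hz / 440.0))

# A trumpet sounding concert Bb4 (~466.16 Hz) maps to MIDI note 70,
# which a DAW could route to any software synthesizer.
print(freq_to_midi(466.16))  # -> 70
```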

At the conference, David performed an improvisational piece on “Stella”, a snippet of which can be heard here!

To find out more about David’s work on “Stella”, visit his website: https://www.openstella.com/

Congratulations, David, on your performance!

WCBS News Radio Covers SONYC Project

A huge thank you to WCBS News Radio for covering the Sounds of New York City (SONYC) project! SONYC is a National Science Foundation-funded research project, run in conjunction with NYU MARL and the NYU Center for Urban Science and Progress, that monitors NYC noise levels through a large-scale sensor network combined with machine learning and listening techniques.

“The noise levels in the city are incredibly high,” says Charlie Mydlarz, the senior research scientist for the Sounds of New York City (SONYC) project at the NYU Center for Urban Science and Progress. “In certain locations they are at levels that the World Health Organization considers to be harmful to health.”
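To give a rough sense of what “monitoring noise levels” means in practice, here is a minimal, hypothetical sketch of how one block of sensor audio could be converted to a decibel level and checked against a threshold. The function name, calibration constant, and the roughly 70 dB guideline figure are illustrative assumptions, not SONYC’s actual pipeline or calibration.

```python
import numpy as np

def spl_db(samples, sensitivity_pa_per_unit, ref_pressure_pa=20e-6):
    """Estimate the sound pressure level (dB SPL) of a block of audio samples.

    `samples` holds raw sensor readings and `sensitivity_pa_per_unit` converts
    them to Pascals; both are hypothetical placeholders, not SONYC calibration.
    """
    pressure = samples * sensitivity_pa_per_unit   # convert readings to Pascals
    rms = np.sqrt(np.mean(pressure ** 2))          # root-mean-square pressure
    return 20 * np.log10(rms / ref_pressure_pa)    # decibels relative to 20 µPa

# Flag a one-second block that exceeds a commonly cited ~70 dB guideline
# for sustained environmental noise.
block = np.random.randn(44100) * 0.1               # stand-in for one second of sensor audio
level = spl_db(block, sensitivity_pa_per_unit=1.0)
if level > 70:
    print(f"Block at {level:.1f} dB SPL exceeds the 70 dB guideline")
```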

Check out the video here:

https://wcbs880.radio.com/articles/nyus-noise-study-new-york-city-sweet-spot-mike-sugerman?fbclid=IwAR2Z_U-O6JXpy7aTmltlixgeMkrMbykKB5XJZxG9UsIOsfbIil6DIbIlGcI

WSN Covers the Holodeck

A huge thank you to NYU’s independent student newspaper, Washington Square News, for highlighting Music Technology’s Dr. Agnieszka Roginska and her team’s work on the Holodeck! The Holodeck is “a staging environment in which participants can engage with various virtual reality environments” and has received a multi-million-dollar grant from the National Science Foundation.

Check out the article below!

The ‘Holodeck’ Propels NYU to the Future

MARL Talk: Serge Belongie

“From Visipedia to PointAR” by Serge Belongie

Abstract:

In this talk, Prof. Belongie will provide an overview of his group’s research projects at Cornell Tech involving Computer Vision, Machine Learning, and Human-in-the-Loop Computing. The talk will cover projects involving the identification of plant and animal species (Visipedia) and learning perceptual embeddings of food (SNaCK). It will conclude with a preview of a new effort to build a projector-based human-computer interaction apparatus that allows computers to point to physical objects in the real world (PointAR).

Serge Belongie received a B.S. (with honor) in EE from Caltech in 1995 and a Ph.D. in EECS from Berkeley in 2000. While at Berkeley, his research was supported by an NSF Graduate Research Fellowship. From 2001 to 2013 he was a professor in the Department of Computer Science and Engineering at the University of California, San Diego.

He is currently a professor at Cornell Tech and the Department of Computer Science at Cornell University. His research interests include Computer Vision, Machine Learning, Crowdsourcing and Human-in-the-Loop Computing. He is also a co-founder of several companies including Digital Persona, Anchovi Labs and Orpix. He is a recipient of the NSF CAREER Award, the Alfred P. Sloan Research Fellowship, the MIT Technology Review “Innovators Under 35” Award and the Helmholtz Prize for fundamental contributions in Computer Vision.

SWiTCH Collaborates with Artist Camille Trust

This past weekend, the NYU Society of Women in Technology (SWiTCH) had the opportunity to record, engineer, and help produce three tracks for pop artist and singer-songwriter Camille Trust!

From 8 am until 6 pm, SWiTCH members from all grade levels came together to set up microphones for the band members and vocalists, patch the patch bays, run the consoles, troubleshoot signal flow, run Pro Tools, and engineer a recording session with Camille Trust and her band.

For more information about SWiTCH and how to be part of their upcoming events, email nyuswitch@gmail.com.

MARL Talk: Research at the Sonic Arts Research Centre, Belfast (SARC)

Abstract: The talk will provide insight into current research at the Sonic Arts Research Centre, Belfast (SARC). Dr Franziska Schroeder and Prof Pedro Rebelo will present work from SARC, including inclusive VR design and the physics-based simulation and synthesis of mechano-acoustic systems and analog circuitry for the development of new digital musical instruments.
They will also give insights into practice-based sonic arts research projects, including their mobile listening app ‘LiveSHOUT’ and their work in the area of socially engaged sonic arts practice.

Dr Franziska Schroeder (Germany / UK)

Originally from Germany, Franziska is based at the Sonic Arts Research Centre, Queen’s University Belfast where she holds the post of senior lecturer in music and sonic arts.

Franziska trained as a contemporary saxophonist in Australia, and in 2006 completed her PhD at the University of Edinburgh, where her research focused on performance, digital technologies and theories of embodiment. She has published widely in diverse international journals and has given several invited keynote speeches on the topic of performance and emerging technological platforms.

Franziska has published a book on performance and the threshold, a book on user-generated content and a volume on music improvisation (Soundweaving, 2014). She performs as saxophonist in a variety of contexts and has released several CDs on the Creative Source label, as well as a recording on the SLAM label with a semi-autonomous technological artifact. In 2015 she released an album on the pfmentum label with two Brazilian musicians, and 2016 saw the release of a Bandcamp album with her female trio Flux.

Throughout 2018 Franziska is leading a research team at Queen’s University on a project that investigates immersive technologies in collaboration with disabled musicians and Belfast’s only professional contemporary music ensemble, the Hard Rain Soloist Ensemble (HRSE). As part of this team, Franziska designed a new VR narrative work entitled “Embrace”. This piece critically investigates ideas of disability, identity, and empathy. “Embrace” is the first showcase piece created at the Sonic Arts Research Centre within its newly established research group “SARC_Immerse”, a group that has positioned itself as a leader in the field of high-quality audio in virtual environments.

Franziska leads the Performance without Barriers research group, a group of PhD and post-doctoral students investigating inclusive music technologies. At Queen’s University Belfast, Franziska teaches improvisation, digital performance, and critical theory.

Prof Pedro Rebelo (Portugal / UK)

Pedro is a composer, sound artist and performer working primarily in chamber music, improvisation and installation with new technologies. In 2002, he was awarded a PhD by the University of Edinburgh where he conducted research in both music and architecture.

His music has been presented in venues such as the Melbourne Recital Hall, National Concert Hall Dublin, Queen Elizabeth Hall, Ars Electronica, Casa da Música, and in events such as Weimarer Frühjahrstage fur zeitgenössische Musik, Wien Modern Festival, Cynetart and Música Viva. His work as a pianist and improvisor has been released by Creative Source Recordings and he has collaborated with musicians such as Chris Brown, Mark Applebaum, Carlos Zingaro, Evan Parker and Pauline Oliveros.

His writings reflect his approach to design and creative practice in a wider understanding of contemporary culture and emerging technologies. Pedro has been Visiting Professor at Stanford University (2007) and in 2012 he was appointed Professor at Queen’s and awarded the Northern Bank’s “Building Tomorrow’s Belfast” prize. He is a professor of sonic arts at the Sonic Arts Research Centre, Belfast.

MARL talk by Dan Ellis (Google)

When: Tuesday, May 15th @10am

Where: 6th Floor Conference Room (609), 35 West 4th Street

Title: “Supervised and Unsupervised Semantic Audio Representations”

Abstract: The Sound Understanding team at Google has been developing automatic sound classification tools with the ambition to cover all possible sounds – speech, music, and environmental.  I will describe our application of vision-inspired deep neural networks to the classification of our new ‘AudioSet’ ontology of ~600 sound events.  I’ll also talk about recent work using triplet loss to train semantic representations — where semantically ‘similar’ sounds end up close by in the representation — from unlabeled data.
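For readers unfamiliar with the technique, here is a minimal, self-contained sketch of a hinge-style triplet loss on embedding vectors, the general idea behind pulling semantically similar sounds together and pushing dissimilar ones apart. It illustrates the standard formulation only, not the Sound Understanding team’s actual model or training setup.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge-style triplet loss on embedding vectors.

    Pulls the anchor toward a semantically similar ("positive") example and
    pushes it at least `margin` farther from a dissimilar ("negative") one.
    """
    d_pos = np.sum((anchor - positive) ** 2, axis=-1)  # squared distance to positive
    d_neg = np.sum((anchor - negative) ** 2, axis=-1)  # squared distance to negative
    return np.maximum(0.0, d_pos - d_neg + margin)     # zero once the gap exceeds the margin

# Toy example with 3-D embeddings of three audio clips (values are made up).
anchor   = np.array([0.1, 0.9, 0.0])   # e.g. a dog bark
positive = np.array([0.2, 0.8, 0.1])   # a similar / co-occurring sound
negative = np.array([0.9, 0.1, 0.3])   # an unrelated sound
print(triplet_loss(anchor, positive, negative))
```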

Bio:  Dan Ellis joined Google in 2015 after 15 years as a faculty member in the Electrical Engineering department at Columbia University, where he headed the Laboratory for Recognition and Organization of Speech and Audio (LabROSA). He has over 150 publications in the areas of audio processing, speech recognition, and music information retrieval.

Joint work with Aren Jansen, Manoj Plakal, Ratheet Pandya, Shawn Hershey, Jiayang Liu, Channing Moore, Rif A. Saurous



Welcome Professor Brian McFee

NYU Music Technology welcomes Dr. Brian McFee as an assistant professor in Music Technology and Data Science, effective Fall 2018!

Dr. McFee has been at NYU for the past few years as a fellow of the Center for Data Science. Previously, he was a postdoctoral research scholar in the Center for Jazz Studies and LabROSA at Columbia University, and before that he conducted his graduate research at UCSD.

Dr. McFee develops machine learning tools to analyze music and multimedia data; his work includes recommender systems, image and audio analysis, similarity learning, cross-modal feature integration, and automatic annotation.

Electronic Music Performance Concerts

This week! Two Electronic Music Performances, Wednesday and Thursday nights!
Come hear students of Dafna Naphtali’s two Electronic Music Performance classes, as they play new electronic music, compositions, experiments, and improvisations.
Wednesday, May 9th, 8pm
Electric Pizza
NYU Electronic Music Performance – Section 001
with guest Hans Tammen conducting from a Dark Circuits Orchestra score, plus samples, drones, time-travel, facial gestures as control, and a self-designed speaker instrument.
Starring the creative and all-star students… Tristan Alleyne, Sam Grossman, Quinton Ashley, Angel E. Daniels, Ned Dana, Erez Aviram, Daksh Bhatia, Brendan Prednis, Harrison Shimazu, Emma Camell, John Sloan, and Pari Songmuang. Directed by Dafna Naphtali.
NYU Education Building
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Thursday, May 10th, 8pm
NYU Electronic Music Performance – Section 002
with a guest computer conductor by Mohamed Kubbara
plus music made with plunderphonics, waving hands, secret signals, sportscasters, and more…
Starring the creative and all-star students… Gregory Borodulin, Max Chidzero, Miles Grossenbacher, Thomas Miritello, Trevor Rivkin, Ethan Uno-Portillo, Nick Royall, Greg Tock, Jake Sandakly, Emily Thaler, and Zoltán Sindhu. Directed by Dafna Naphtali.

Spring 2018 Graduate Thesis Defense Presentations

With the end of the semester right around the corner, it’s time for our graduate thesis defense presentations. Projects to be presented include original hardware and software development, cognition research, audio analysis, assistive audio technologies, acoustic design, and much more.


Check out the full schedule with times and thesis titles HERE

Live Stream

All defense presentations will be live streamed! Friends and family from near and far are welcome to tune in. Feel free to share THIS link!
