NIME 2010: Sydney, Australia
10th International Conference on New Interfaces for Musical Expression, NIME 2010, Sydney, Australia, June 15-18, 2010. nime.org 2010
- Owen Vallis, Jordan Hochenbaum, Ajay Kapur: A Shift Towards Iterative and Open-Source Design for Musical Interfaces. 1-6
- Yutaro Maruyama, Yoshinari Takegawa, Tsutomu Terada, Masahiko Tsukamoto: UnitInstrument: Easy Configurable Musical Instruments. 7-12
- Jos Mulder: The Loudspeaker as Musical Instrument. 13-18
- Miha Ciglar: An Ultrasound Based Instrument Generating Audible and Tactile Sound. 19-22
- Ted Hayes: Neurohedron: A Nonlinear Sequencer Interface. 23-25
- Nobuyuki Umetani, Jun Mitani, Takeo Igarashi: Designing Custom-made Metallophone with Concurrent Eigenanalysis. 26-30
- Sungkuk Chun, Andrew Hawryshkewich, Keechul Jung, Philippe Pasquier: Freepad: A Custom Paper-based MIDI Interface. 31-36
- John A. Mills, Damien Di Fede, Nicolas Brix: Music Programming in Minim. 37-42
- Thor Magnusson: An Epistemic Dimension Space for Musical Devices. 43-46
- Ahmet Baki Kocaballi, Petra Gemeinboeck, Rob Saunders: Investigating the Potential for Shared Agency using Enactive Interfaces. 47-50
- Noah Liebman, Michael Nagara, Jacek Spiewla, Erin Zolkosky: Cuebert: A New Mixing Board Concept for Musical Theatre. 51-56
- Charles Roberts, Matthew Wright, JoAnn Kuchera-Morin, Lance Putnam: Dynamic Interactivity Inside the AlloSphere. 57-62
- Florian Alt, Alireza Sahami Shirazi, Stefan Legien, Albrecht Schmidt, Julian Mennenöh: Creating Meaningful Melodies from Text Messages. 63-68
- Tim Humphrey, Madeleine Flynn, Jesse Stevens: Epi-thet: A Musical Performance Installation and a Choreography of Stillness. 69-71
- Tilo Hähnel: From Mozart to MIDI: A Rule System for Expressive Articulation. 72-75
- Georg Essl, Alexander Müller: Designing Mobile Musical Instruments and Environments with urMus. 76-81
- Jieun Oh, Jorge Herrera, Nicholas J. Bryan, Luke Dahl, Ge Wang: Evolving The Mobile Phone Orchestra. 82-87
- Atau Tanaka: Mapping Out Instruments, Affordances, and Mobiles. 88-93
- Robyn Taylor, Guy Schofield, John Shearer, Pierre Boulanger, Jayne Wallace, Patrick Olivier: humanaquarium: A Participatory Performance System. 88-93
- Mark Havryliv: Composing For Improvisation with Chaotic Oscillators. 94-99
- Andrew Hawryshkewich, Philippe Pasquier, Arne Eigenfeldt: Beatback: A Real-time Interactive Percussion System for Rhythmic Practise and Exploration. 100-105
- Michael Gurevich, Paul Stapleton, Adnan Marquez-Borbon: Style and Constraint in Electronic Musical Instruments. 106-111
- Hongchan Choi, Ge Wang: LUSH: An Organic Eco + Music System. 112-115
- Tomoyuki Yamaguchi, Tsukasa Kobayashi, Anna Ariga, Shuji Hashimoto: TwinkleBall: A Wireless Musical Interface for Embodied Sound Media. 116-119
- Joanne Cannon, Stuart Favilla: Expression and Spatial Motion: Playable Ambisonics. 120-124
- Nick Collins: Contrary Motion: An Oppositional Interactive Music System. 125-129
- Etienne Deleflie, Greg Schiemer: Images as Spatial Sound Maps. 130-135
- Kevin Schlei: Relationship-Based Instrument Mapping of Multi-Point Data Streams Using a Trackpad Interface. 136-139
- Lonce Wyse, Nguyen Dinh Duy: Instrumentalizing Synthesis Models. 140-143
- Álvaro Cassinelli, Yusaku Kuribara, Alexis Zerroug, Masatoshi Ishikawa, Daito Manabe: scoreLight: Playing with a Human-Sized Laser Pick-Up. 144-149
- Karl Yerkes, Greg Shear, Matthew Wright: Disky: a DIY Rotational Interface with Inherent Dynamics. 150-155
- Jorge Solis, Klaus Petersen, Tetsuro Yamamoto, Masaki Takeuchi, Shimpei Ishikawa, Atsuo Takanishi, Kunimatsu Hashimoto: Development of the Waseda Saxophonist Robot and Implementation of an Auditory Feedback Control. 156-161
- Ajay Kapur, Michael Darling: A Pedagogical Paradigm for Musical Robotics. 162-165
- Ye Pan, Min-Gyu Kim, Kenji Suzuki: A Robot Musician Interacting with a Human Partner through Initiative Exchange. 166-169
- Ivica Bukvic, Thomas Martin, Eric Standley, Michael Matthews: Introducing L2Ork: Linux Laptop Orchestra. 170-173
- Nicholas J. Bryan, Jorge Herrera, Jieun Oh, Ge Wang: MoMu: A Mobile Music Toolkit. 174-177
- Luke Dahl, Ge Wang: Sound Bounce: Physical Metaphors in Designing Mobile Music Performance. 178-181
- Georg Essl, Michael Rohs, Sven G. Kratz: Use the Force (or something) - Pressure and Pressure-Like Input for Mobile Music Performance. 182-185
- Roger Mills: Dislocated Sound: A Survey of Improvisation in Networked Audio Platforms. 186-191
- Florent Berthaut, Myriam Desainte-Catherine, Martin Hachet: DRILE: An Immersive Environment for Hierarchical Live-Looping. 192-197
- Robin Fencott, Nick Bryan-Kinns: Hey Man, You're Invading my Personal Space! Privacy and Awareness in Collaborative Music. 198-203
- Charles Martin, Benjamin Forster, Hanna Cormick: Cross-Artform Performance Using Networked Interfaces: Last Man to Die's Vital LMTD. 204-207
- Alexander Refsum Jensenius, Kjell Tore Innervik, Ivar Frounberg: Evaluating the Subjective Effects of Microphone Placement on Glass Instruments. 208-211
- Rudolfo Quintas: Glitch Delighter: Lighter's Flame Base Hyper-Instrument for Glitch Music in Burning The Sound Performance. 212-216
- Andrew P. McPherson, Youngmoo E. Kim: Augmenting the Acoustic Piano with Electromagnetic String Actuation and Continuous Key Position Sensing. 217-222
- Cesar M. Grossmann: Developing a Hybrid Contrabass Recorder Resistances, Expression, Gestures and Rhetoric. 223-228
- Alfonso Pérez Carrillo, Jordi Bonada: The Bowed Tube: a Virtual Violin. 229-232
- Jordan Hochenbaum, Ajay Kapur, Matthew Wright: Multimodal Musician Recognition. 233-237
- Enric Guaus, Tan Hakan Özaslan, Eric Palacios, Josep Lluís Arcos: A Left Hand Gesture Caption System for Guitar Based on Capacitive Sensors. 238-243
- Andrew Schmeder, Adrian Freed: Support Vector Machine Learning for Gesture Signal Estimation with a Piezo-Resistive Fabric Touch Surface. 244-249
- Jan C. Schacher: Motion To Gesture To Sound: Mapping For Interactive Dance. 250-254
- Ian Whalley: Generative Improv. & Interactive Music Project (GIIMP). 255-258
- Kristian Nymoen, Kyrre Glette, Ståle Andreas Skogstad, Jim Tørresen, Alexander Refsum Jensenius: Searching for Cross-Individual Relationships between Sound and Movement Features using an SVM Classifier. 259-262
- Takashi Baba, Mitsuyo Hashida, Haruhiro Katayose: "VirtualPhilharmony": A Conducting System with Heuristics of Conducting an Orchestra. 263-270
- Tobias Großhauser, Ulf Großekathöfer, Thomas Hermann: New Sensors and Pattern Recognition Techniques for String Instruments. 271-276
- Tilo Hähnel, Axel Berndt: Expressive Articulation for Synthetic Music Performances. 277-282
- Andrew R. Brown: Network Jamming: Distributed Performance using Generative Music. 283-286
- Ivar Frounberg, Kjell Tore Innervik, Alexander Refsum Jensenius: Glass Instruments - From Pitch to Timbre. 287-290
- Chris Kiefer: A Malleable Interface for Sonic Exploration. 291-296
- Victor Zappi, Andrea Brogni, Darwin G. Caldwell: OSC Virtual Controller. 297-302
- Smilen Dimitrov: Extending the Soundcard for Use with Generic DC Sensors Demonstrated by Revisiting a Vintage ISA Design. 303-308
- Sylvain Le Groux, Jônatas Manzolli, Paul F. M. J. Verschure: Disembodied and Collaborative Musical Interaction in the Multimodal Brain Orchestra. 309-314
- Jordan Hochenbaum, Owen Vallis, Dimitri Diakopoulos, Jim W. Murphy, Ajay Kapur: Designing Expressive Musical Interfaces for Tabletop Surfaces. 315-318
- Wendy Suiter: Toward Algorithmic Composition of Expression in Music Using Fuzzy Logic. 319-322
- Kirsty A. Beilharz, Andrew Vande Moere, Barbara Stiel, Claudia A. Calò, Martin Tomitsch, Adrian Lombard: Expressive Wearable Sonification and Visualisation: Design and Evaluation of a Flexible Display. 323-326
- Jeremiah Nugroho, Kirsty A. Beilharz: Understanding and Evaluating User Centred Design in Wearable Expressions. 327-330
- Sihwa Park, Seunghun Kim, Samuel Lee, Woon Seung Yeo: Online Map Interface for Creative and Interactive. 331-334
- Aristotelis Hadjakos, Max Mühlhäuser: Analysis of Piano Playing Movements Spanning Multiple Touches. 335-338
- Sebastian Heinz, Sile O'Modhrain: Designing a Shareable Musical TUI. 339-342
- Adrian Freed: Visualizations and Interaction Strategies for Hybridization Interfaces. 343-347
- Björn Wöldecke, Christian Geiger, Holger Reckter, Florian Schulz: ANTracks 2.0 - Generative Music on Multiple Multitouch Devices. 348-351
- Laewoo Kang, Hsin-Yi Chien: Hé: Calligraphy as a Musical Interface. 352-355
- Martin Marier: The Sponge: A Flexible Interface. 356-359
- Lawrence Fyfe, Sean Lynch, Carmen Hull, Sheelagh Carpendale: SurfaceMusic: Mapping Virtual Touch-based Instruments to Physical Models. 360-363
- Aengus Martin, Sam Ferguson, Kirsty A. Beilharz: Mechanisms for Controlling Complex Sound Sources: Applications to Guitar Feedback Control. 364-367
- Jim Tørresen, Eirik Renton, Alexander Refsum Jensenius: Wireless Sensor Data Collection based on ZigBee Communication. 368-371
- Javier Jaimovich, Benjamin Knapp: Synchronization of Multimodal Recordings for Musical Performance Research. 372-374
- Giuseppe Torre, Mark O'Leary, Brian Tuohy: POLLEN: A Multimedia Interactive Network Installation. 375-376
- Xiaoyang Feng: Irregular Incurve. 377-379
- Chikashi Miyama: Peacock: A Non-Haptic 3D Performance Interface. 380-382
- Jukka Holm, Harri Holm, Jarno Seppänen: Associating Emoticons with Musical Genres. 383-386
- Yoichi Nagashima: Untouchable Instrument "Peller-Min". 387-390
- Javier Jaimovich: Ground Me! An Interactive Sound Art Installation. 391-394
- Norma Saiph Savage, Syed R. Ali, Norma Elva Chávez: Mmmmm: A Multi-modal Mobile Music Mixer. 395-398
- Chih-Chieh Tsai, Cha-Lin Liu, Teng-Wen Chang: An Interactive Responsive Skin for Music. 399-402
- Nick Bryan-Kinns, Robin Fencott, Oussama Metatla, Shahin Nabavian, Jennifer G. Sheridan: Interactional Sound and Music: Listening to CSCW, Sonification, and Sound Art. 403-406
- Ståle Andreas Skogstad, Alexander Refsum Jensenius, Kristian Nymoen: Using IR Optical Marker Based Motion Capture for Exploring Musical Interaction. 407-410
- Benjamin Buch, Pieter Coussement, Lüder Schmidt: "playing robot": An Interactive Sound Installation in Human-Robot Interaction Design for New Media Art. 411-414
- Loïc Reboursière, Christian Frisson, Otso Lähdeoja, John A. Mills, Cécile Picard-Limpens, Todor Todoroff: Multimodal Guitar: A Toolbox For Augmented Guitar Performances. 415-418
- Michael Berger: The GRIP MAESTRO: Idiomatic Mappings of Emotive Gestures for Control of Live Electroacoustic Music. 419-422
- Kimberlee Headlee, Tatyana Koziupa, Diana Siwiak: Sonic Virtual Reality Game: How Does Your Body Sound? 423-426
- Alex Stahl, Patricia Clemens: Auditory Masquing: Wearable Sound Systems for Diegetic Character Voices. 427-430
- Paul Rothman: The Ghost: An Open-Source, User Programmable MIDI Performance Controller. 431-435
- Garth Paine: Towards a Taxonomy of Realtime Interfaces for Electronic Music Performance. 436-439
- Hyun-Soo Kim, Je-Han Yoon, Moon-Sik Jung: Interactive Music Studio: The Soloist. 444-446
- Pierre Alexandre Tremblay, Diemo Schwarz: Surfing the Waves: Live Audio Mosaicing of an Electric Bass Performance as a Corpus Browsing Interface. 447-450
- A. Cavan Fyans, Michael Gurevich, Paul Stapleton: Examining the Spectator Experience. 451-454
- Nick Collins, Chris Kiefer, Zeeshan Patoli, Martin White: Musical Exoskeletons: Experiments with a Motion Capture Suit. 455-458
- Jim W. Murphy, Ajay Kapur, Carl Burgin: The Helio: A Study of Membrane Potentiometers and Long Force Sensing Resistors for Musical Interfaces. 459-462
- Stuart Taylor, Jonathan Hook: FerroSynth: A Ferromagnetic Music Interface. 463-466
- Josh Dubrau, Mark Havryliv: P[a]ra[pra]xis: Towards Genuine Realtime 'Audiopoetry'. 467-468
- Kris Makoto Kitani, Hideki Koike: ImprovGenerator: Online Grammatical Induction for On-the-Fly Improvisation Accompaniment. 469-472
- Christian Frisson, Benoît Macq, Stéphane Dupont, Xavier Siebert, Damien Tardieu, Thierry Dutoit: DeviceCycle: Rapid and Reusable Prototyping of Gestural Interfaces, Applied to Audio Browsing by Similarity. 473-476
- Alexander Müller, Fabian Hemmert, Götz Wintergerst, Ron Jagodzinski: Reflective Haptics: Resistive Force Feedback for Musical Performances with Stylus-Controlled Instruments. 477-478
- Alison Mattek, Mark Freeman, Eric Humphrey: Revisiting Cagean Composition Methodology with a Modern Computational Implementation. 479-480
- Sam Ferguson, Emery Schubert, Catherine J. Stevens: Movement in a Contemporary Dance Work and its Relation to Continuous Emotional Response. 481-484
- Teemu Ahmaniemi: Gesture Controlled Virtual Instrument with Dynamic Vibrotactile Feedback. 485-488
- Jeffrey Hass: Creating Integrated Music and Video for Dance: Lessons Learned and Lessons Ignored. 489-492
- Warren Burt: Packages for ArtWonk: New Mathematical Tools for Composers. 493-496
- Jace Miller, Tracy Hammond: Wiiolin: a Virtual Instrument Using the Wii Remote. 497-500
- Max Meier, Max Schranner: The Planets. 501-504
