İnce, C., Toka, M., & Baytaş, M.A. (2018). Siren: Interface for Pattern Languages. In Proceedings of the 2018 International Conference on New Interfaces for Musical Expression (NIME 2018). [forthcoming]
Baytaş, M.A., Coşkun, A., Yantaç, A.E., & Fjeld, M. (2018). Towards Materials for Computational Heirlooms: Blockchains and Wristwatches. In Proceedings of the 2018 Conference on Designing Interactive Systems (DIS 2018). [forthcoming]
Ünlüer, A.A., Baytaş, M.A., Buruk, O.T., Cemalcilar, Z., Yemez, Y., & Özcan, O. (2017). The Effectiveness of Mime-Based Creative Drama Education for Exploring Gesture-Based User Interfaces. International Journal of Art & Design Education.
Baytaş, M.A., Göksun, T., & Özcan, O. (2016). The Perception of Live-sequenced Electronic Music via Hearing and Sight. In Proceedings of the 2016 International Conference on New Interfaces for Musical Expression (NIME 2016).
Baytaş, M.A., Yemez, Y., & Özcan, O. (2014). Hotspotizer: End-User Authoring of Mid-Air Gestural Interactions. In Proceedings of the 8th Nordic Conference on Human-Computer Interaction (NordiCHI ’14).
Baytaş, M.A., Batis, E., Bylund, M., Çay, D., Yantaç, A.E., & Fjeld, M. (2017). ViewFinder: Supporting the Installation and Reconfiguration of Multi-Camera Motion Capture Systems with a Mobile Application. In Proceedings of the 16th International Conference on Mobile and Ubiquitous Multimedia (MUM 2017).
Baytaş, M.A., Yantaç, A.E., & Fjeld, M. (2017). LabDesignAR: Configuring Multi-camera Motion Capture Systems in Augmented Reality. In Proceedings of the 23rd ACM Symposium on Virtual Reality Software and Technology (VRST ’17).
Baytaş, M.A., Yemez, Y., & Özcan, O. (2014). User Interface Paradigms for Visually Authoring Mid-Air Gestures: A Survey and a Provocation. In Proceedings of the Workshop on Engineering Gestures for Multimodal Interfaces.
Özcan, O., Ünlüer, A., Baytaş, M.A., & Serim, B. (2012). Rethinking Spherical Media Surfaces by Re-reading Ancient Greek Vases. Paper presented at the ITS ’12 workshop Beyond Flat Displays: Towards Shaped and Deformable Interactive Surfaces.
Baytaş, M.A., Çay, D., Yantaç, A.E., & Fjeld, M. (2017). Motion Capture in Gesture and Sign Language Research. Poster presented at the DComm conference Language as a Form of Action.
Baytaş, M.A. (2014). End-User Authoring of Mid-Air Gestural Interactions. M.A. thesis submitted to the Koç University Graduate School of Social Sciences and Humanities.
Arçelik A.Ş. (2015–2018): “KUAR: Koç University – Arçelik Research Center for Creative Industries” (₺8,500,000)
FP7-PEOPLE-2012-IAPP (2013–2014): “NaMoCap: Natural Motion Capture Process for Creative Industries” (Grant #324333, €658,000)
TÜBİTAK 1001 (2012–2014): “Specifications on a Design Education Methodology for Gestural Interface Design” (Grant #112E056, ₺188,000)
H2020-MSCA-ITN-2015 (2016–2020): “DComm: Deictic Communication – A Multidisciplinary Training” (Grant #676063, €3,460,823)
KoçSistem (2012–2013): “Interactive Monitoring & Control Center with Touchless 3D Gestures” (₺110,000)
European Commission Horizon 2020 Marie Skłodowska-Curie “Early Stage Researcher” Fellowship (2017–2020)
Koç University GSSSH (Graduate School of Social Sciences and Humanities) Fellowship (2014–2018)
TÜBİTAK 1001 Scientific and Technological Research Projects Funding Program Project Scholarship (2012–2014, grant #112E056)
Koç University Vehbi Koç Scholarship (2010)
Koç University Merit Scholarship (2007–2012)
Higher Education Loans and Housing Board (Yüksek Öğrenim Kredi ve Yurtlar Kurumu) Scholarship (2007–2011)