Crowdsourced Audio Annotation and Quality Evaluation
Mendez, A.E.M., Cartwright, M., Bello, J.P., Nov, O. Eliciting Confidence for Improving Crowdsourced Audio Annotations. In Proceedings of the ACM on Human-Computer Interaction, vol. 6(CSCW1), 2022.
Mendez, A.E.M., Cartwright, M., Bello, J.P. Machine-Crowd-Expert Model for Increasing User Engagement and Annotation Quality. In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems Extended Abstracts (CHI EA), 2019.
Cartwright, M., Dove, G., Mendez, A.E.M., Bello, J.P., Nov, O. Crowdsourcing Multi-label Audio Annotation Tasks with Citizen Scientists. In Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI), 2019.
Cartwright, M., Salamon, J., Seals, A., Nov, O., Bello, J.P. Investigating the Effect of Sound-Event Loudness on Crowdsourced Audio Annotations. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2018.
Cartwright, M., Pardo, B., Mysore, G. Crowdsourced Pairwise-Comparison for Source Separation Evaluation. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2018.
Cartwright, M., Seals, A., Salamon, J., Williams, A., Mikloska, S., MacConnell, D., Law, E., Bello, J.P., Nov, O. Seeing Sound: Investigating the Effects of Visualizations and Complexity on Crowdsourced Audio Annotations. In Proceedings of the ACM on Human-Computer Interaction, vol. 1(CSCW), 2017.