A searchable list of some of my publications is below. You can also access my publications from the sites listed at the end of this page.
Publications:
Harish Haresamudram, Irfan Essa, Thomas Ploetz
Assessing the State of Self-Supervised Human Activity Recognition using Wearables Journal Article
In: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), vol. 6, iss. 3, no. 116, pp. 1–47, 2022.
@article{2022-Haresamudram-ASSHARUW,
title = {Assessing the State of Self-Supervised Human Activity Recognition using Wearables},
author = {Harish Haresamudram and Irfan Essa and Thomas Ploetz},
url = {https://dl.acm.org/doi/10.1145/3550299
https://arxiv.org/abs/2202.12938
https://arxiv.org/pdf/2202.12938},
doi = {10.1145/3550299},
year = {2022},
date = {2022-09-07},
urldate = {2022-09-07},
journal = {Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT)},
volume = {6},
number = {116},
issue = {3},
pages = {1--47},
publisher = {ACM},
abstract = {The emergence of self-supervised learning in the field of wearables-based human activity recognition (HAR) has opened up opportunities to tackle the most pressing challenges in the field, namely to exploit unlabeled data to derive reliable recognition systems for scenarios where only small amounts of labeled training samples can be collected. As such, self-supervision, i.e., the paradigm of 'pretrain-then-finetune', has the potential to become a strong alternative to the predominant end-to-end training approaches, let alone hand-crafted features for the classic activity recognition chain. Recently, a number of contributions have been made that introduced self-supervised learning into the field of HAR, including Multi-task self-supervision, Masked Reconstruction, CPC, and SimCLR, to name but a few. With the initial success of these methods, the time has come for a systematic inventory and analysis of the potential self-supervised learning has for the field. This paper provides exactly that. We assess the progress of self-supervised HAR research by introducing a framework that performs a multi-faceted exploration of model performance. We organize the framework into three dimensions, each containing three constituent criteria, such that each dimension captures specific aspects of performance, including the robustness to differing source and target conditions, the influence of dataset characteristics, and the feature space characteristics. We utilize this framework to assess seven state-of-the-art self-supervised methods for HAR, leading to the formulation of insights into the properties of these techniques and to establish their value towards learning representations for diverse scenarios.},
keywords = {activity recognition, IMWUT, ubiquitous computing, wearable computing},
pubstate = {published},
tppubtype = {article}
}
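The 'pretrain-then-finetune' paradigm surveyed in the paper above is commonly evaluated by freezing a pretrained encoder and training only a linear classifier head on a small labeled set. A minimal sketch of that linear-evaluation protocol, where the random-projection "encoder" and synthetic data are placeholders (not from the paper) standing in for a self-supervised network and real sensor windows:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical frozen "pretrained" encoder: a fixed random projection standing
# in for a network pretrained with self-supervision on unlabeled sensor data.
W_enc = rng.normal(size=(64, 32))

def encode(x):
    return np.tanh(x @ W_enc)  # frozen features; never updated below

# Small labeled set, mimicking the low-label scenarios the paper studies.
X = rng.normal(size=(200, 64))
y = (X[:, 0] > 0).astype(int)  # synthetic binary activity label

# Linear evaluation: train only a logistic-regression head on frozen features.
Z = encode(X)
w, b = np.zeros(32), 0.0
for _ in range(500):                      # plain gradient descent
    p = 1 / (1 + np.exp(-(Z @ w + b)))    # predicted probabilities
    g = p - y                             # gradient of the log-loss
    w -= 0.1 * Z.T @ g / len(y)
    b -= 0.1 * g.mean()

acc = ((Z @ w + b > 0).astype(int) == y).mean()
print(acc)  # well above the 0.5 chance level
```

Only `w` and `b` are trained; the quality of the frozen features determines how far above chance the head can get, which is what makes this protocol a probe of the pretrained representation.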
Harish Haresamudram, Irfan Essa, Thomas Ploetz
Contrastive Predictive Coding for Human Activity Recognition Journal Article
In: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, vol. 5, no. 2, pp. 1–26, 2021.
@article{2021-Haresamudram-CPCHAR,
title = {Contrastive Predictive Coding for Human Activity Recognition},
author = {Harish Haresamudram and Irfan Essa and Thomas Ploetz},
url = {https://doi.org/10.1145/3463506
https://arxiv.org/abs/2012.05333},
doi = {10.1145/3463506},
year = {2021},
date = {2021-06-01},
urldate = {2021-06-01},
journal = {Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies},
volume = {5},
number = {2},
pages = {1--26},
abstract = {Feature extraction is crucial for human activity recognition (HAR) using body-worn movement sensors. Recently, learned representations have been used successfully, offering promising alternatives to manually engineered features. Our work focuses on effective use of small amounts of labeled data and the opportunistic exploitation of unlabeled data that are straightforward to collect in mobile and ubiquitous computing scenarios. We hypothesize and demonstrate that explicitly considering the temporality of sensor data at representation level plays an important role for effective HAR in challenging scenarios. We introduce the Contrastive Predictive Coding (CPC) framework to human activity recognition, which captures the long-term temporal structure of sensor data streams. Through a range of experimental evaluations on real-life recognition tasks, we demonstrate its effectiveness for improved HAR. CPC-based pre-training is self-supervised, and the resulting learned representations can be integrated into standard activity chains. It leads to significantly improved recognition performance when only small amounts of labeled training data are available, thereby demonstrating the practical value of our approach.},
keywords = {activity recognition, IMWUT, machine learning, ubiquitous computing},
pubstate = {published},
tppubtype = {article}
}
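The CPC framework in the paper above learns by scoring a context representation against its true future step versus distractors. As an illustration only (not the authors' code; the batch construction, cosine normalization, and temperature here are invented for this sketch), an InfoNCE-style contrastive objective can be written in a few lines of NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

def info_nce(context, future, temperature=0.1):
    """InfoNCE-style loss: each context vector should score highest against
    its own future step among all candidate futures in the batch."""
    c = context / np.linalg.norm(context, axis=1, keepdims=True)
    f = future / np.linalg.norm(future, axis=1, keepdims=True)
    logits = c @ f.T / temperature               # (B, B) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # stabilize the softmax
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # positives on the diagonal

B, D = 8, 16
z = rng.normal(size=(B, D))
# Aligned pairs (future ≈ context) should incur a much lower loss than
# randomly paired vectors, which is the signal pre-training exploits.
aligned = info_nce(z, z + 0.01 * rng.normal(size=(B, D)))
random_pairs = info_nce(z, rng.normal(size=(B, D)))
print(aligned, random_pairs)
```

Minimizing this loss pushes each context embedding toward its own future and away from the other futures in the batch; in the paper's setting, the encoder and autoregressive context network trained this way are then reused for activity recognition.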
Edison Thomaz, Irfan Essa, Gregory Abowd
Challenges and Opportunities in Automated Detection of Eating Activity Proceedings Article
In: Mobile Health, pp. 151–174, Springer, 2017.
@inproceedings{2017-Thomaz-COADEA,
title = {Challenges and Opportunities in Automated Detection of Eating Activity},
author = {Edison Thomaz and Irfan Essa and Gregory Abowd},
url = {https://link.springer.com/chapter/10.1007/978-3-319-51394-2_9},
doi = {10.1007/978-3-319-51394-2_9},
year = {2017},
date = {2017-01-01},
urldate = {2017-01-01},
booktitle = {Mobile Health},
pages = {151--174},
publisher = {Springer},
abstract = {Motivated by applications in nutritional epidemiology and food journaling, computing researchers have proposed numerous techniques for automating dietary monitoring over the years. Although progress has been made, a truly practical system that can automatically recognize what people eat in real-world settings remains elusive. Eating detection is a foundational element of automated dietary monitoring (ADM) since automatically recognizing when a person is eating is required before identifying what and how much is being consumed. Additionally, eating detection can serve as the basis for new types of dietary self-monitoring practices such as semi-automated food journaling. This chapter discusses the problem of automated eating detection and presents a variety of practical techniques for detecting eating activities in real-world settings. These techniques center on three sensing modalities: first-person images taken with wearable cameras, ambient sounds, and on-body inertial sensors [34–37]. The chapter begins with an analysis of how first-person images reflecting everyday experiences can be used to identify eating moments using two approaches: human computation and convolutional neural networks. Next, we present an analysis showing how certain sounds associated with eating can be recognized and used to infer eating activities. Finally, we introduce a method for detecting eating moments with on-body inertial sensors placed on the wrist.},
keywords = {activity recognition, computational health, ubiquitous computing},
pubstate = {published},
tppubtype = {inproceedings}
}
Edison Thomaz, Abdelkareem Bedri, Temiloluwa Prioleau, Irfan Essa, Gregory Abowd
Exploring Symmetric and Asymmetric Bimanual Eating Detection with Inertial Sensors on the Wrist Proceedings Article
In: Proceedings of the 1st Workshop on Digital Biomarkers, pp. 21–26, ACM, 2017.
@inproceedings{2017-Thomaz-ESABEDWISW,
title = {Exploring Symmetric and Asymmetric Bimanual Eating Detection with Inertial Sensors on the Wrist},
author = {Edison Thomaz and Abdelkareem Bedri and Temiloluwa Prioleau and Irfan Essa and Gregory Abowd},
doi = {10.1145/3089341.3089345},
year = {2017},
date = {2017-01-01},
urldate = {2017-01-01},
booktitle = {Proceedings of the 1st Workshop on Digital Biomarkers},
pages = {21--26},
organization = {ACM},
keywords = {activity recognition, ubiquitous computing},
pubstate = {published},
tppubtype = {inproceedings}
}
Edison Thomaz, Irfan Essa, Gregory Abowd
A Practical Approach for Recognizing Eating Moments with Wrist-Mounted Inertial Sensing Proceedings Article
In: ACM International Conference on Ubiquitous Computing (UBICOMP), 2015.
@inproceedings{2015-Thomaz-PAREMWWIS,
title = {A Practical Approach for Recognizing Eating Moments with Wrist-Mounted Inertial Sensing},
author = {Edison Thomaz and Irfan Essa and Gregory Abowd},
url = {https://dl.acm.org/doi/10.1145/2750858.2807545},
doi = {10.1145/2750858.2807545},
year = {2015},
date = {2015-09-01},
urldate = {2015-09-01},
booktitle = {ACM International Conference on Ubiquitous Computing (UBICOMP)},
abstract = {Recognizing when eating activities take place is one of the key challenges in automated food intake monitoring. Despite progress over the years, most proposed approaches have been largely impractical for everyday usage, requiring multiple on-body sensors or specialized devices such as neck collars for swallow detection. In this paper, we describe the implementation and evaluation of an approach for inferring eating moments based on 3-axis accelerometry collected with a popular off-the-shelf smartwatch. Trained with data collected in a semi-controlled laboratory setting with 20 subjects, our system recognized eating moments in two free-living condition studies (7 participants, 1 day; 1 participant, 31 days), with F-scores of 76.1% (66.7% Precision, 88.8% Recall), and 71.3% (65.2% Precision, 78.6% Recall). This work represents a contribution towards the implementation of a practical, automated system for everyday food intake monitoring, with applicability in areas ranging from health research to food journaling.},
keywords = {activity recognition, computational health, machine learning, Ubicomp, ubiquitous computing},
pubstate = {published},
tppubtype = {inproceedings}
}
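The F-scores quoted in the abstract above are the harmonic mean of precision and recall; a two-line check (in Python, not from the paper) reproduces them from the reported precision/recall pairs:

```python
def f1(precision, recall):
    # F-score: harmonic mean of precision and recall
    return 2 * precision * recall / (precision + recall)

# Free-living study 1: 66.7% precision, 88.8% recall
print(round(100 * f1(0.667, 0.888), 1))  # 76.2 (paper reports 76.1, likely from unrounded counts)
# Free-living study 2: 65.2% precision, 78.6% recall
print(round(100 * f1(0.652, 0.786), 1))  # 71.3
```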
Edison Thomaz, Aman Parnami, Jonathan Bidwell, Irfan Essa, Gregory Abowd
Technological Approaches for Addressing Privacy Concerns when Recognizing Eating Behaviors with Wearable Cameras. Proceedings Article
In: ACM International Joint Conference on Pervasive and Ubiquitous Computing (UBICOMP), 2013.
@inproceedings{2013-Thomaz-TAAPCWREBWWC,
title = {Technological Approaches for Addressing Privacy Concerns when Recognizing Eating Behaviors with Wearable Cameras.},
author = {Edison Thomaz and Aman Parnami and Jonathan Bidwell and Irfan Essa and Gregory Abowd},
doi = {10.1145/2493432.2493509},
year = {2013},
date = {2013-09-01},
urldate = {2013-09-01},
booktitle = {ACM International Joint Conference on Pervasive and Ubiquitous Computing (UBICOMP)},
keywords = {activity recognition, computational health, privacy, Ubicomp, ubiquitous computing},
pubstate = {published},
tppubtype = {inproceedings}
}
Edison Thomaz, Aman Parnami, Irfan Essa, Gregory Abowd
Feasibility of Identifying Eating Moments from First-Person Images Leveraging Human Computation Proceedings Article
In: Proceedings of ACM International SenseCam and Pervasive Imaging (SenseCam '13), 2013.
@inproceedings{2013-Thomaz-FIEMFFILHC,
title = {Feasibility of Identifying Eating Moments from First-Person Images Leveraging Human Computation},
author = {Edison Thomaz and Aman Parnami and Irfan Essa and Gregory Abowd},
doi = {10.1145/2526667.2526672},
year = {2013},
date = {2013-01-01},
urldate = {2013-01-01},
booktitle = {Proceedings of ACM International SenseCam and Pervasive Imaging (SenseCam '13)},
keywords = {activity recognition, behavioral imaging, computational health, ubiquitous computing, wearable computing},
pubstate = {published},
tppubtype = {inproceedings}
}
Edison Thomaz, Vinay Bettadapura, Gabriel Reyes, Megha Sandesh, Grant Schindler, Thomas Ploetz, Gregory Abowd, Irfan Essa
Recognizing Water-Based Activities in the Home Through Infrastructure-Mediated Sensing Proceedings Article
In: ACM International Conference on Ubiquitous Computing (UBICOMP), 2012.
@inproceedings{2012-Thomaz-RWAHTIS,
title = {Recognizing Water-Based Activities in the Home Through Infrastructure-Mediated Sensing},
author = {Edison Thomaz and Vinay Bettadapura and Gabriel Reyes and Megha Sandesh and Grant Schindler and Thomas Ploetz and Gregory Abowd and Irfan Essa},
url = {http://www.ethomaz.com/2012/09/05/activity-rec-ims-ubicomp-2012/},
doi = {10.1145/2370216.2370230},
year = {2012},
date = {2012-09-01},
urldate = {2012-09-01},
booktitle = {ACM International Conference on Ubiquitous Computing (UBICOMP)},
keywords = {aware home, intelligent environments, ubiquitous computing},
pubstate = {published},
tppubtype = {inproceedings}
}
Jing Wang, Grant Schindler, Irfan Essa
Orientation Aware Scene Understanding for Mobile Camera Proceedings Article
In: ACM International Conference on Ubiquitous Computing (UBICOMP), 2012.
@inproceedings{2012-Wang-OASUMC,
title = {Orientation Aware Scene Understanding for Mobile Camera},
author = {Jing Wang and Grant Schindler and Irfan Essa},
url = {http://www.cc.gatech.edu/cpl/projects/orientation-aware/},
doi = {10.1145/2370216.2370258},
year = {2012},
date = {2012-09-01},
booktitle = {ACM International Conference on Ubiquitous Computing (UBICOMP)},
keywords = {mobile vision, scene understanding, ubiquitous computing},
pubstate = {published},
tppubtype = {inproceedings}
}
Other Publication Sites
A few more sites that aggregate research publications: Academia.edu, Bibsonomy, CiteULike, Mendeley.
Copyright/About
[Please see the Copyright Statement that may apply to the content listed here.]
This list of publications is produced by using the teachPress plugin for WordPress.