Open Access System for Information Sharing

Full metadata record
Files in This Item:
There are no files associated with this item.
dc.contributor.author: SUNGJAE, CHO
dc.contributor.author: Kim, Yoonsu
dc.contributor.author: JAEWOONG, JANG
dc.contributor.author: HWANG, INSEOK
dc.date.accessioned: 2023-11-02T05:20:37Z
dc.date.available: 2023-11-02T05:20:37Z
dc.date.created: 2023-10-31
dc.date.issued: 2023-10-12
dc.identifier.uri: https://oasis.postech.ac.kr/handle/2014.oak/119030
dc.description.abstract: Imagine a near-future smart home. Home-embedded visual AI sensors continuously monitor the resident, inferring her activities and internal states that enable higher-level services. Here, as home-embedded sensors passively monitor a free person, good inferences happen randomly. The inferences' confidence highly depends on how congruent her momentary conditions are to the conditions favored by the AI models, e.g., front-facing or unobstructed. We envision new strategies of AI-to-Human Actuation (AHA) that empower the sensory AIs with proactive actuation so that they induce the person's conditions to be more favorable to the AIs. In this light, we explore the initial feasibility and efficacy of AHA in the context of home-embedded visual AIs. We build a taxonomy of actuations that could be issued to home residents to benefit visual AIs. We deploy AHA in an actual home rich in sensors and interactive devices. With 20 participants, we comprehensively study their experiences with proactive actuation blended with their usual home routines. We also demonstrate the substantially improved inferences of the actuation-empowered AIs over the passive sensing baseline. This paper sets forth an initial step towards interweaving human-targeted AIs and proactive actuation to yield more chances for high-confidence inferences without sophisticating the model, in order to improve robustness against unfavorable conditions.
dc.language: English
dc.publisher: Association for Computing Machinery (ACM)
dc.relation.isPartOf: ACM UbiComp 2023 (The ACM International Joint Conference on Pervasive and Ubiquitous Computing)
dc.relation.isPartOf: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
dc.title: AI-to-Human Actuation: Boosting Unmodified AI's Robustness by Proactively Inducing Favorable Human Sensing Conditions
dc.type: Conference
dc.type.rims: CONF
dc.identifier.bibliographicCitation: ACM UbiComp 2023 (The ACM International Joint Conference on Pervasive and Ubiquitous Computing), pp.1 - 32
dc.citation.conferenceDate: 2023-10-08
dc.citation.conferencePlace: MX
dc.citation.conferencePlace: Cancun, Mexico
dc.citation.endPage: 32
dc.citation.startPage: 1
dc.citation.title: ACM UbiComp 2023 (The ACM International Joint Conference on Pervasive and Ubiquitous Computing)
dc.contributor.affiliatedAuthor: SUNGJAE, CHO
dc.contributor.affiliatedAuthor: JAEWOONG, JANG
dc.contributor.affiliatedAuthor: HWANG, INSEOK
dc.description.journalClass: 1

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
