SURF-IT Research Projects

 


The SURF-IoT program provides a unique 10-week summer research opportunity for UCI undergraduates to become immersed in research and applications related to the Internet of Things (IoT).

Calit2 faculty, students, and research professionals, working with leading California technology companies, conduct research in “living laboratories” focused on the scientific, technological, and social components of the Internet of Things.

2015 SURF-IoT Research Projects

The following faculty-mentored research projects are available during the 2015 SURF-IoT Program. Select a link for an overview of the project, associated faculty mentors, project prerequisites, and related publications.

    1) Cloud-Based Tools to Empower Interdisciplinary Research: A Case Study on the Mesoamerican Color Survey Data Archive 

    2) Developing a Mobile Application to Address Obesity among Latina Breast Cancer Survivors 

    3) Health360 – Comprehensive, Intuitive and Interactive View of Health Aspects to Drive Self-Healthcare and Social Well-Being 

    4) M2M-Tracking: Encounter-Based Collaborative Localization using an M2M-Assisted Particle Filter 

    5) PainBuddy: Using Mobile Applications to Assist and Track Home-Based Therapy for Children Suffering from Cancer 

    6) Qualoscopy 

    7) Using Wearable and Connected Sensing Platforms to Augment Pediatric Informatics 

    8) Vestibular Rehabilitation using Wide-Angled Head Mounted Displays with Stereoscopic 3D 




 Project #1:  Cloud-Based Tools to Empower Interdisciplinary Research: A Case Study on the Mesoamerican Color Survey Data Archive
Faculty Mentor:  Dr. Kimberly A. Jameson, Institute for Mathematical Behavioral Sciences

Additional Faculty Mentors:
Sergio Gago, Ph.D.
California Institute for Telecommunications and Information Technology

Description:
In a world where networked technology and digital information generate ever more data archives and corpora, there is a growing need to understand how to manage and investigate such archives. Unfortunately, large datasets are difficult to manage and share in collaborative working environments, demanding a high level of expertise in database technologies. To date, very few intuitive tools are available for cross-disciplinary research involving large datasets.

In this SURF-IoT project we will offer students the opportunity to acquire highly applicable skills for building intuitive interfaces for cloud-based collaborative spaces and specific internet-based tools for multidisciplinary research involving large datasets. As a case study, we will use tools such as a wiki web application and crowdsourcing to create a valuable categorization data archive covering more than 160 indigenous languages.

The database and its integrated toolbox will be a “one-stop” fully-integrated platform for training and original research, and as such will be unlike any other similar data archive. This project will lead to an unequaled resource for many student research projects and a potential basis for student publications.

Students' Involvement and Expected Outcomes:

Students mentored in this project will work in a proactive, collaborative research team guided by faculty to develop and use cutting-edge web applications to build a collaborative research environment (wiki-based), including an intuitive interface for managing large datasets, a content management system (CMS) to preserve and update data, and crowdsourcing procedures (MTurk-based) to transcribe original hard-copy data into a digital archive. The case study and long-term goal of this project is to digitize an archival data set on indigenous languages from Mesoamerica and elsewhere, which will preserve the data and
enable the use of cloud-based tools to empower worldwide cross-disciplinary research on never-before-published categorization data from ~160 living languages (estimated at over 3,000 participants’ standardized responses).

Specifically, students will work with faculty to:

➢ Design and develop an intuitive collaborative web environment for worldwide interdisciplinary research built on MediaWiki.
➢ Learn how to design and work with crowdsourced, internet-based survey methods such as MTurk.
➢ Design and develop procedures for collecting large-scale transcriptions and aggregating transcribed data to create a valuable database.
➢ Use computer algorithms to help preserve knowledge about the world’s vanishing languages.
➢ Develop skills and invent new ways to collect, handle, and aggregate data for large corpora that will be extendable and valuable for a wide range of internet-based computing needs in the future.
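As a sketch of the aggregation step above, the fragment below resolves multiple crowdsourced transcriptions of the same survey item by majority vote. The item IDs, response format, and normalization rule are illustrative assumptions, not the project's actual pipeline:

```python
from collections import Counter

def aggregate_transcriptions(responses):
    """Majority-vote aggregation of crowdsourced transcriptions.

    `responses` maps an item ID (e.g. one cell of a survey sheet) to the
    list of transcriptions submitted by different workers for that item.
    Returns, per item, the consensus transcription and its agreement rate.
    """
    result = {}
    for item_id, answers in responses.items():
        # Normalize whitespace and case so trivial variants count as agreement.
        cleaned = [a.strip().lower() for a in answers if a.strip()]
        if not cleaned:
            continue  # no usable transcriptions for this item
        winner, votes = Counter(cleaned).most_common(1)[0]
        result[item_id] = (winner, votes / len(cleaned))
    return result

# Hypothetical MTurk submissions for two survey cells:
submissions = {
    "card-017-row-3": ["k'an", "k'an", "kan"],
    "card-017-row-4": ["yax", "yax", "yax"],
}
consensus = aggregate_transcriptions(submissions)
```

Items with low agreement rates could then be routed back to workers for re-transcription rather than entering the archive directly.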

Prerequisites: We are looking for a diverse team with interests and skills in informatics and computer science, who care about practical internet approaches to large-scale database development as well as STEM (science, technology, engineering, and mathematics), culture and language, and psychology. Innovative thinkers interested in math and science, research design, and the practical use of internet computing to influence important cultural outcomes are encouraged to apply.

Recommended Web sites and publications: 
   The MesoAmerican Color Survey project: http://aris.ss.uci.edu/~kjameson/FundingMesoAmericanColorSurveyArchive.pdf
   The World Color Survey Database: http://www1.icsi.berkeley.edu/wcs/
   Berlin, Brent and Paul Kay. Basic Color Terms: Their Universality and Evolution. Berkeley and Los Angeles: University of California Press, 1969.
   Excerpts from Funded NSF Research Plan: IBSS: New methods for investigating the formation of individual and shared concepts and their dynamic dispersion across related societies: http://aris.ss.uci.edu/~kjameson/NSF2014ResearchPlan.pdf
   Chen, J. J., Menezes, N. J., and Bradley, A. D. (2011). “Opportunities for Crowdsourcing Research on Amazon Mechanical Turk.” Amazon Mechanical Turk, 410 Terry Ave North, Seattle, WA 98109. Apr 08, 2011.: http://www.crowdsourcing.org/document/opportunities-for-crowdsourcing-research-on-amazon-mechanical-turk/3471
   Brabham, D. C. (2008). Crowdsourcing as a Model for Problem Solving: An Introduction and Cases. Convergence: The International Journal of Research into New Media Technologies, 14(1), 75–90. doi:10.1177/1354856507084420
   Chai, X., Vuong, B. Q., Doan, A. H., & Naughton, J. F. (2009). Efficiently incorporating user feedback into information extraction and integration programs. Proceedings of the 35th SIGMOD International Conference on Management of Data, 87–100. doi:10.1145/1559845.1559857
   Doan, A., Ramakrishnan, R., & Halevy, A. Y. (2011). Crowdsourcing systems on the World-Wide Web. Communications of the ACM, 54(4), 86. doi:10.1145/1924421.1924442
   Kittur, A., Chi, E. H., & Suh, B. (2008). Crowdsourcing user studies with Mechanical Turk. Proceedings of the Twenty-Sixth Annual CHI Conference on Human Factors in Computing Systems - CHI ’08, 453. doi:10.1145/1357054.1357127
   Kittur, A., Suh, B., Pendleton, B. A., & Chi, E. H. (2007). He says, she says: conflict and coordination in Wikipedia. ACM Conference on Human Factors in Computing Systems, 453–462. doi:10.1145/1240624.1240698
   Viégas, F., Wattenberg, M., & McKeon, M. (2007). The Hidden Order of Wikipedia. Online Communities and Social Computing, 4564, 445–454. doi:10.1007/978-3-540-73257-0_49



 Project #2:  Developing a Mobile Application to Address Obesity among Latina Breast Cancer Survivors
Faculty Mentor:  Professor Dara Heather Sorkin, Medicine

Additional Mentors:
Alfred Kobsa - Professor, Department of Informatics
Yunan Chen - Associate Professor, Department of Informatics

Project Description: Breast cancer is the most common cancer diagnosed in Latinas. While the five-year survival rate for Latinas is high, this population experiences disparities in symptoms (e.g., fatigue and negative mood), comorbidities (e.g., diabetes), and lifestyle and behavioral risk factors during survivorship. Self-management strategies during survivorship that include a healthy diet, exercise, and weight management have been associated with reduced risk of morbidity and mortality. However, Latina breast cancer survivors (LBCS) have high rates of obesity and fatigue, and are significantly less likely to engage in physical activity relative to their non-Hispanic white and African American counterparts. Barriers to engaging in a healthy lifestyle include fatigue, lack of knowledge, low socioeconomic status (SES), cultural norms and perceptions, and lack of support.

Current state-of-the-art protocols, which have been demonstrated in clinical studies to be effective in changing behavior and improving outcomes in cancer patients and survivors, include clinic-based educational and support programs. These programs, however, rarely include individual personalization to patients’ specific cultural and demographic characteristics. Perhaps for this reason, despite having been shown to be effective in clinical studies, these programs have shown limited success for maintaining the behavior change.

Personalized technology (e.g., smartphones) plays a key role in knowledge acquisition and self-management in health and wellness. Mobile health (M-health) application use is on the rise: an estimated 500 million smartphone users worldwide will be using medical- or health-related mobile applications by 2015. Capitalizing on a broad and eager audience, a multitude of mobile applications have been developed and advertised as tools to promote health and/or manage chronic disease. Yet the vast majority of these M-health applications are neither evidence-based nor rigorously evaluated. Thus M-health, an integral part of daily life for millions of users, provides a unique and highly relevant platform for behavioral intervention and long-term self-management during survivorship, with the potential for wide dissemination and low-cost delivery.

We plan to develop a smartphone-based system, called Mi Salud, Mi Vida (My Health, My Life), that allows LBCS to record their intake of food, exercise, medication, weight, fatigue, and mood. Mi Salud, Mi Vida goes beyond patient monitoring of health behaviors to return personalized information about the relationship between a patient’s actions and/or mood states and her health behaviors. In other words, in addition to the general knowledge the app records and provides, it also prompts patients to develop individualized and personalized knowledge.

We propose to test the feasibility of using Mi Salud, Mi Vida, and to modify and improve it based on feedback from users.

Impact: If the Mi Salud, Mi Vida intervention demonstrates efficacy, it will offer a novel, low-cost strategy for long-term self-management to address disparities in health and lifestyle risk factors in LBCS. Furthermore, it will pave the way for an R01 to test the effectiveness of this strategy in other cancer populations, and will eventually lead to a prototype app with important clinical outcomes and dissemination potential. If successful, the approach could be adapted to other chronic care conditions that lend themselves to self-management.

Student’s Involvement and Expected Outcomes:

Students participating in this project will collaborate with faculty and work together to do the following:

● Through extensive research on breast cancer survivors, determine health factors that are essential to track for promoting weight loss.

● Develop ways to aggregate data into purposeful and meaningful information for the user.

● Assist in user testing of the mobile app and discuss ways in which the design can be improved based on data from user testing.

● Update the mobile app wireframe based on conclusions made from detailed discussion with mentors and findings from research and data.

● Collaborate with the developer to generate user-interface and user-experience solutions while communicating task flows.

Prerequisites: The project is expected to appeal to students from various majors, but preference will be given to those with backgrounds in health and information and computer science. Those whose interests are in health informatics, human-computer interaction, and/or user-interface and user-experience design and those having experience with wire framing software are encouraged to apply.

Recommended Web sites and publications: 
   Broderick, J., Devine, T., Langhans, E., Lemerise, A. J., Lier, S., and Harris, L. (2014). Designing Health Literate Mobile Apps.: http://www.iom.edu/~/media/Files/Perspectives-Files/2014/Discussion-Papers/BPH-HealthLiterateApps.pdf
   7 Pitfalls to Avoid in mHealth Web & App Design: http://bridgedesign.com/7-pitfalls-to-avoid-in-mhealth-web-app-design/
   Dehling, T., Gao, F., Schneider, S., and Sunyaev, A. (2015). Exploring the Far Side of Mobile Health: Information Security and Privacy of Mobile Health Apps on iOS and Android.: http://mhealth.jmir.org/2015/1/e8/
   Fu OS, Crew KD, Jacobson JS, et al. Ethnicity and persistent symptom burden in breast cancer survivors. J Cancer Surviv 2009;3:241-50.
   Buffart LM, Ros WJ, Chinapaw MJ, et al. Mediators of physical exercise for improvement in cancer survivors' quality of life. Psychooncology 2014;23:330.
   Bender JL, Yue RY, To MJ, Deacken L, Jadad AR. A lot of action, but not in the right direction: systematic review and content analysis of smartphone applications for the prevention, detection, and management of cancer. J Med Internet Res 2013;15:e287.
   Luoma ML, Hakamies-Blomqvist L, Blomqvist C, Nikander R, Gustavsson-Lilius M, Saarto T. Experiences of breast cancer survivors participating in a tailored exercise intervention - a qualitative study. Anticancer Res 2014;34:1193-9.



 Project #3:  Health360 – Comprehensive, Intuitive and Interactive View of Health Aspects to Drive Self-Healthcare and Social Well-Being
Faculty Mentor:  Professor John T. Billimek, Medicine

Project Description:

The United States has the highest health care cost inflation among leading developed nations.(1) Between 2006 and 2010, healthcare costs in the U.S. increased by a staggering 19%.(2) Even more importantly, overspending in U.S. healthcare due to overuse is estimated at $750 billion.(3) It is deeply perplexing to see such statistics for a nation whose talented healthcare providers and technology are among the world’s best. One of the most promising solutions is patient-centered healthcare. But for that to succeed, the patient has to play the most critical role by taking ownership of their health, which includes educating themselves about their medical conditions, proactively following prescribed medication, continuously monitoring health metrics, engaging in regular wellness activities, and more. Patients are eager to play that role; after all, it is their health. However, the biggest impediment is the lack of data, as well as of technology that makes that data highly intuitive and actionable.

The Internet of Things (IoT) provides a great opportunity to fix this problem through interconnected devices, continuous data collection and reporting, which can enable seamless automated health insights delivered anytime, anywhere, on any device.

In the last few years we have seen a proliferation of portable health sensors, many of which are already available on the market at affordable prices. However, the true value of the data collected from such sensors lies in integrating the data from all sensors, combining it with a patient’s medical history, giving doctors the ability to monitor changes in the patient’s health metrics with regard to ongoing medication, and making patients more aware of the impact of their everyday healthcare decisions (such as not adhering to the timely consumption of prescribed medicines).

Health360 addresses this problem by using IoT-enabled devices, along with human-computer interaction and information technology, to deliver a simple, customizable dashboard providing a comprehensive view of the user’s health. Health360 not only informs its users; it also educates them and assists them in carefully following their medication as well as pursuing wellness goals. A simple yet powerful design will enable users to manage their health through Health360 without getting lost in the details or complexity of medical terminology. Beyond individual health, Health360 also promotes social well-being through features such as Wellness Challenger and Voice of Patient, which enable social interaction while maintaining privacy when required.

Please see Page 2 of http://www.urop.uci.edu/surf-it/2015_summer/Health360.pdf (link below) for a sample Health360 screen.

Sources:
1. Wikipedia (http://en.wikipedia.org/wiki/Health_care_finance_in_the_United_States)
2. Health Care Transparency 101 - White Paper, Castlight Health
3. Institute of Medicine of the National Academies, “Best Care at Lower Cost: The Path to Continuously Learning Health Care in America,” September 2012

Health360 will be developed entirely based on Open Source software and utilities in order to encourage collaboration from other universities as well as to offer this service to end users for free (by avoiding any licensing or subscription costs).

Student’s Involvement and Expected Outcomes:

Student Activities:

• Write code snippets to collect and integrate the data from different health sensor devices (IoT enabled). This will include some preprocessing and transformations such as data normalization, using standard metrics, etc.
• Design and build a multi-platform seamless experience for users, through which users can access Health360 seamlessly over computers, tablets and smartphones. (This will be done in phases. Initially the focus would be to deliver a minimum viable product for the web interface only, i.e. computer access.)
• Perform a comprehensive usability assessment of the interface's features through focus groups and rigorous A/B testing (the scope for this activity will depend on how much time is left after the above two tasks).
• Design creative elements for the interface to enable the delivery of integrated healthcare information from IoT devices in an intuitive and engaging way.
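The first activity above, collecting and normalizing readings from heterogeneous sensors, can be sketched as follows. The device names, field names, and unit conversions are hypothetical placeholders, not actual Health360 interfaces:

```python
from datetime import datetime, timezone

def normalize_reading(device, raw):
    """Convert a device-specific reading into a common metric record."""
    if device == "scale_lbs":
        # Standardize weight to kilograms.
        return {"metric": "weight_kg", "value": round(raw["lbs"] * 0.453592, 2)}
    if device == "bp_cuff":
        return {"metric": "blood_pressure",
                "value": (raw["systolic"], raw["diastolic"])}
    if device == "step_counter":
        # Some devices report counts as strings; coerce to int.
        return {"metric": "steps", "value": int(raw["count"])}
    raise ValueError(f"unknown device: {device}")

def integrate(readings):
    """Tag each normalized reading with a UTC timestamp for the dashboard."""
    now = datetime.now(timezone.utc).isoformat()
    return [dict(normalize_reading(d, r), recorded_at=now) for d, r in readings]

records = integrate([
    ("scale_lbs", {"lbs": 180}),
    ("step_counter", {"count": "6500"}),
])
```

A real pipeline would add per-device drivers behind the same interface, so that the dashboard only ever sees records in the common schema.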

Expected Outcomes:

• Data collection and reporting from multiple IoT health sensor devices
• An intuitive, actionable interface for providers as well as patients based on health metrics monitoring data
• Innovative features that will engage users towards greater health awareness, leading to self-healthcare and a patient-centric healthcare system

Specific Skills that Students will Develop:

• Hands-on technical IoT experience (across a variety of popular devices)
• Basic data engineering (data collection, data transformation, data integration, etc.)
• Web design and development
• Basic understanding of healthcare
• Applying technological innovation to meet patients' needs

Prerequisites: We are looking for Computer Science or Information Technology students with solid experience in web development and programming. The student must have taken courses and/or developed projects that involved building websites and solving problems through programming. The most important capability we are seeking is that the candidate be a fast learner, as this project may require quickly learning new programming languages and using them for development.

Healthcare experience or background is preferred; however, it is not required.

Recommended Web sites and publications: 
   Jara, Antonio J., Miguel A. Zamora, and Antonio F. Skarmeta. "An internet of things-based personal device for diabetes therapy management in ambient assisted living (AAL)." Personal and Ubiquitous Computing 15.4 (2011): 431-440.
   Pang, Zhibo, et al. "Design of a terminal solution for integration of in-home health care devices and services towards the Internet-of-Things." Enterprise Information Systems 9.1 (2015): 86-116.
   Li, Xu, et al. "Smart community: an internet of things application." Communications Magazine, IEEE 49.11 (2011): 68-75.
   Rohokale, Vandana Milind, Neeli Rashmi Prasad, and Ramjee Prasad. "A cooperative Internet of Things (IoT) for rural healthcare monitoring and control." Wireless Communication, Vehicular Technology, Information Theory and Aerospace & Electronic Systems Technology (Wireless VITAE), 2011 2nd International Conference on. IEEE, 2011.
   Pae, YoungWoo, et al. "Using Mashup Technology to Integrate Medical Data for Patient Centric Healthcare." Future Information Technology. Springer Berlin Heidelberg, 2014. 71-76.
   Viswanathan, Hariharasudhan, Baozhi Chen, and Dario Pompili. "Research challenges in computation, communication, and context awareness for ubiquitous healthcare." Communications Magazine, IEEE 50.5 (2012): 92-99.
   Murphy, Judy. "Patient as center of the health care universe: A closer look at patient-centered care." Nursing Economics 29.1 (2011): 35-37.
   Buchanan, William J., et al. "Patient centric health care: an integrated and secure, cloud-based, e-Health platform." (2012).
   Maizes, Victoria, David Rakel, and Catherine Niemiec. "Integrative medicine and patient-centered care." Explore: The Journal of Science and Healing 5.5 (2009): 277-289.
   Demiris, George, et al. "Patient-centered applications: use of information technology to promote disease management and wellness. A white paper by the AMIA knowledge in motion working group." Journal of the American Medical Informatics Association 15.1 (2008): 8-13.
   Chawla, Nitesh V., and Darcy A. Davis. "Bringing big data to personalized healthcare: a patient-centered framework." Journal of General Internal Medicine 28.3 (2013): 660-665.
   Project description with sample screen on Page 2: http://www.urop.uci.edu/surf-it/2015_summer/Health360.pdf



 Project #4:  M2M-Tracking: Encounter-Based Collaborative Localization using an M2M-Assisted Particle Filter
Faculty Mentor:  Professor Mohammad Abdullah al Faruque, Electrical Engineering & Computer Science

Description:  This project targets the development of the Encounter-Based Collaborative Tracking (ECT) algorithm, which provides an offline method of counteracting displacement-vector error by exploiting opportunistic radio encounters between many devices.

The integration of computing, storage, wireless communication, and sensing technologies in miniature packages gives rise to the Internet of Things (IoT), which is about the ability of “things” to sense and act according to the requirement of environmental conditions with Machine-to-Machine (M2M) communications. The main purpose of communications among devices is to maintain state coherency so that the users or devices can act more intelligently with the contextual information. One fundamental type of such context is the device’s or its user’s trajectory, namely “locations over time.”

Many techniques have been proposed for localization, but most assume that observers (such as cameras) or beacons (such as GPS satellites) have been deployed at known locations to provide a reference. However, it is not always possible or practical to deploy such infrastructure for complete coverage, since there will always be black-out areas. On the other hand, miniature inertial sensors (such as combinations of accelerometers, gyroscopes and magnetometers) can detect the relative motion of the IoT device itself. Thus, dead reckoning techniques, which roughly estimate the time-varying locations of pedestrians from the displacement of the detected motion [1, 2, 3], have been proposed for trajectory tracking using inertial sensors. Still, dead reckoning will not work if the starting location is unknown.
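The dead-reckoning idea can be illustrated with a short sketch: given a known start point, per-step displacement estimates (step length and heading) are integrated into a trajectory. The numbers are illustrative; a real system would derive them from noisy accelerometer, gyroscope, and magnetometer data:

```python
import math

def dead_reckon(start, steps):
    """Integrate (step_length_m, heading_rad) estimates into a trajectory.

    `start` is a known (x, y) position; without it, dead reckoning can
    only produce a relative path, which is the limitation noted above.
    """
    x, y = start
    trajectory = [(x, y)]
    for length, heading in steps:
        # Each displacement is added onto the previous position estimate.
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        trajectory.append((x, y))
    return trajectory

# One metre east, then one metre north:
path = dead_reckon((0.0, 0.0), [(1.0, 0.0), (1.0, math.pi / 2)])
```

Because each step's error accumulates onto the last position, the estimate drifts over time, which is exactly the error that opportunistic encounters are meant to correct.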

Please see http://www.urop.uci.edu/surf-it/2015_summer/M2M Tracking.pdf (link below) for a project description that includes the figure referenced below.

The idea of M2M ECT is to share location information among heterogeneous nodes with different sensing capabilities. For example, suppose “Things” A and B have inertial sensors, while C has only a real-time clock and a temperature sensor. In collaborative tracking (including localization), knowing the A–C and B–C distances, A and B can help C localize itself; in turn, C can provide A and B with time and temperature. Fig. 1 shows the proposed algorithm, based on our preliminary work, which demonstrates how a device uses encounter events to localize itself with a particle filter. Fig. 1.a lists the possible locations based on the movement the sensor estimated. When a pair of IoT devices encounter each other and exchange their trajectories, Fig. 1.b shows the possible location set for them. Three of these are impossible locations because they fall outside the boundaries of the floor plan, as shown in Fig. 1.c. Therefore, we can estimate where both IoT devices actually are.

This project is designed to enable localization, and ultimately trajectory tracking, cooperatively as nodes encounter each other. A standard feature of the popular Bluetooth Low Energy (BLE) protocol is leveraged to sense other IoT devices in proximity. Compared to algorithms requiring infrastructure, ECT provides a feasible tracking model that can be easily implemented by devices with only inertial and proximity sensors plus floor-plan map data. However, given the variety of IoT devices, the ECT algorithm faces the challenge of scaling to a large number of targets: the particle filter must be optimized to provide accurate location estimates for many devices in a power-efficient way. If N devices in range exchange movement information with each other, this requires roughly N² M2M communications, and every device must estimate the possible locations of N devices. Since IoT devices have limited computation ability and power budgets, an optimized algorithm is required that does not sacrifice estimation accuracy or exceed the devices’ power consumption limits.
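A minimal sketch of the encounter-based pruning step, under simplifying assumptions (2-D particles, a rectangular floor-plan boundary, and a single distance estimate from the BLE encounter; all names here are illustrative, not the actual ECT implementation):

```python
def inside(p, floor_plan):
    """Floor-plan check, simplified to a bounding box (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = floor_plan
    return xmin <= p[0] <= xmax and ymin <= p[1] <= ymax

def encounter_update(particles_a, particles_b, measured_dist, tol, floor_plan):
    """Prune device A's particles using a proximity encounter with device B.

    A particle of A survives if it stays inside the floor plan and lies
    within `tol` of the measured encounter distance to at least one of
    B's particles, mirroring the elimination of out-of-boundary
    hypotheses in Fig. 1.
    """
    survivors = []
    for pa in particles_a:
        if not inside(pa, floor_plan):
            continue  # impossible location: outside the floor plan
        for pb in particles_b:
            d = ((pa[0] - pb[0]) ** 2 + (pa[1] - pb[1]) ** 2) ** 0.5
            if abs(d - measured_dist) <= tol:
                survivors.append(pa)
                break
    return survivors
```

Note that the pairwise check costs O(|A|·|B|) per encounter; with N devices all exchanging trajectories this becomes the roughly N² communication and computation burden described above, which the optimized particle filter must reduce.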

Student’s Involvement and Expected Outcomes:

One student is required for this project. It is expected that the feasibility of the developed M2M ECT algorithm for multiple devices will be demonstrated through the following items:

1. The accuracy of location estimation among IoT devices (algorithm performance)
2. Response time of the computation (algorithm complexity)
3. Power consumption of the computation (algorithm cost)

This project is conducted by using the Broadcom WICED Sense Development Kit for which the following development environment and techniques may be required:

1. Android-based mobile devices
2. Broadcom WICED SDK
3. Bluetooth Low Energy (BLE) protocol

Prerequisites: A student who is familiar with the following skills is preferred.
1. Programming language: Java or C/C++
2. Embedded system development
3. Bluetooth Low Energy (BLE) protocol

Recommended Web sites and publications: 
   Davidson, P., Collin, J., and Takala, J. "Application of particle filters for indoor positioning using floor plans." In Ubiquitous Positioning Indoor Navigation and Location Based Service (UPINLBS), 2010 (Oct 2010), IEEE, pp. 1–4.
   Widyawan, Klepal, M., and Beauregard, S. "A backtracking particle filter for fusing building plans with PDR displacement estimates." In Positioning, Navigation and Communication, 2008. WPNC 2008. 5th Workshop on (March 2008), IEEE, pp. 207–212.
   Li, F., Zhao, C., Ding, G., Gong, J., Liu, C., and Zhao, F. "A reliable and accurate indoor localization method using phone inertial sensors." In Proceedings of the 2012 ACM Conference on Ubiquitous Computing (UbiComp ’12) (2012), ACM, pp. 421–430.
   Constandache, I., Bao, X., Choudhury, R. R., and Azizyan, M. "Did you see Bob? Human localization using mobile phones." In Proceedings of ACM International Conference on Mobile Computing and Networking (MobiCom ’10) (2010), ACM, pp. 149–160.
   Symington, A., and Trigoni, N. "Encounter based sensor tracking." In Proceedings of the Thirteenth ACM International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc ’12) (2012), ACM, pp. 15–24.
   “Bluetooth Low Energy (BLE) protocol” [Online]: https://developer.bluetooth.org/TechnologyOverview/Pages/BLE.aspx
   Project description with figure: http://www.urop.uci.edu/surf-it/2015_summer/M2M Tracking.pdf



 Project #5:  PainBuddy: Using Mobile Applications to Assist and Track Home-Based Therapy for Children Suffering from Cancer
Faculty Mentor:  Professor Michelle A. Fortier, Anesthesiology

Additional Mentors:
Sergio Gago, Ph.D.
California Institute for Telecommunications and Information Technology

Description:
The increasing availability of computational portable technology holds promise to transform both the assessment of pain and our understanding of its properties. We are developing a mobile application – PainBuddy – as a mobile health technology (mHealth) to help children with cancer convey the extent of their discomfort and connect them to their doctors, who will be able to monitor children’s symptoms in real time and communicate using an integrated interface. PainBuddy also provides children with training to empower them to manage their pain. PainBuddy aims to improve pain intervention and symptom management by making a diary report animated and interactive as well as adding a skills-training component. Animated characters will ask the needed questions for the diary. Children will select the appropriate answers using their mobile devices. The diary report being implemented is based on a validated pain and symptom assessment protocol.

The information collected by PainBuddy will be stored and encrypted on a dedicated server that will allow physicians to monitor and respond to pain and symptoms in real time. In addition, the program will send real-time data to healthcare providers when pre-configured symptom alerts are activated, connecting physicians and patients through a cloud-server.
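As a sketch, the pre-configured symptom alerts might be expressed as threshold rules evaluated against each diary entry. The symptom names and thresholds below are illustrative assumptions, not PainBuddy's actual clinical protocol:

```python
# Illustrative alert rules over 0-10 self-report scales.
ALERT_RULES = {
    "pain":   lambda score: score >= 7,
    "nausea": lambda score: score >= 8,
}

def check_alerts(diary_entry):
    """Return the symptoms in a diary entry that should notify a provider."""
    return [symptom for symptom, rule in ALERT_RULES.items()
            if rule(diary_entry.get(symptom, 0))]

entry = {"patient_id": "demo-001", "pain": 8, "nausea": 3}
alerts = check_alerts(entry)
```

On the server, a triggered rule would enqueue a real-time notification to the care team alongside the encrypted diary record.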

Students' Involvement and Expected Outcomes:

The student mentored in this project will work in a proactive research team guided by faculty to identify available mobile and web technologies to design and build the innovative technology that will provide PainBuddy with the features needed. Specifically, the student will conduct research to design and implement a framework able to:

1. Gather data from a mobile device (running PainBuddy) and store it in a cloud database. The targeted data includes patients’ response and usage data.

2. Design a convenient and intuitive web interface, which will be used by healthcare providers to monitor children's activity.

3. Ensure patients’ privacy and data confidentiality by building a secure access protocol that will be integrated into the aforementioned framework.

Prerequisites: We are looking for a student with interests and skills in both informatics and computer science who cares about mobile health technologies to improve wellness and quality of life. Innovative thinkers with relevant experience are encouraged to apply. Knowledge of web technologies such as HTML, CSS, and JavaScript, along with MySQL and database management, is preferred.

Recommended Web sites and publications: 
   Ljungman G, Gordh T, Sörensen S, Kreuger A. Pain in paediatric oncology: interviews with children, adolescents and their parents. Acta Paediatr. 1999;88(6):623-630.
   McGrath PA. Development of the World Health Organization Guidelines on cancer pain relief and palliative care in children. J Pain Symptom Manage. 1996;12(2):87-92.
   Gordon DB, Dahl JL, Miaskowski C, et al. American Pain Society recommendations for improving the quality of acute and cancer pain management: American Pain Society Quality of Care Task Force. Arch Intern Med. 2005;165(14):1574-1580. doi:10.1001/archinte.165.14.1574.
   Yeh CH, Lin CF, Tsai JL, Lai YM, Ku HC. Determinants of parental decisions on “drop out” from cancer treatment for childhood cancer patients. J Adv Nurs. 1999;30(1):193-199. doi:10.1046/j.1365-2648.1999.01064.x.
   Hockenberry M. Symptom Management Research in Children With Cancer. J Pediatr Oncol Nurs. 2004;21(3):132-136. doi:10.1177/1043454204264387.
   Center wHO P. Cancer pain relief and palliative care in children. Indian J Pediatr. 1998;66(3). Accessed April 23, 2015.: http://whqlibdoc.who.int/publications/9241545127.pdf
   Gago S, Fortier M, Martinez A. PainBuddy – Using Virtual Characters to Improve Home-Based Therapy for Children Suffering from Cancer. In: Medicine 2.0 Conference. JMIR Publications Inc., Toronto, Canada; 2014. Accessed April 17, 2015.: http://www.medicine20congress.com/ocs/index.php/med/med2014/paper/view/2747
   F. Buttussi, L. Chittaro, and D. Nadalutti, “Bringing mobile guides and fitness activities together: a solution based on an embodied virtual trainer,” … -computer interaction with mobile …, pp. 29–36, 2006.:
   M. PALEARI, C. LISETTI, and M. LETHONEN, “Virtual Agent for Learning Environment Reacting and Interacting Emotionally,” Citeseer, pp. 3–5, 2005.:



 Project #6:  Qualoscopy
Faculty Mentor:  Professor William Karnes, Medicine

Description:  Additional Faculty Mentor:
Donald Jay Patterson
Associate Professor
Informatics

Project Description: “A colonoscopy is a test that allows your doctor to look at the inner lining of your large intestine. He or she uses a thin, flexible tube called a colonoscope to look at the colon. A colonoscopy helps find ulcers, colon polyps, tumors, and areas of inflammation or bleeding. During a colonoscopy, tissue samples can be collected (biopsy) and abnormal growths can be taken out. A colonoscopy can also be used as a screening test to check for cancer or precancerous growths in the colon or rectum (polyps).”

Currently there are no efficient and effective systems for collecting large-scale data on the quality and effectiveness of colonoscopy procedures. However, with nationwide changes in health insurance, these measures are now required for insurance companies to reimburse patients and doctors for the cost of the procedure.

Dr. Karnes has prototyped a system named Qualoscopy that allows data to be collected rapidly during an actual colonoscopy and used to measure and improve the quality of the procedure. Working with Prof. Patterson, he aims to speed up data collection in the clinic and to deploy the resulting big-data system to other clinics in Orange County. The ultimate goal is to have as much of this information as possible collected automatically during a procedure.
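
To make the kind of measurement concrete, colonoscopy quality is conventionally tracked with measures such as adenoma detection rate and withdrawal time. The sketch below is purely illustrative and is not part of Qualoscopy: the `Procedure` record and its fields are invented to show how per-procedure data might roll up into such quality measures.

```python
from dataclasses import dataclass

# Hypothetical per-procedure record; field names are invented for illustration.
@dataclass
class Procedure:
    withdrawal_minutes: float   # time spent withdrawing the scope
    adenoma_found: bool         # whether at least one adenoma was detected

def quality_summary(procedures):
    """Aggregate per-procedure records into standard quality measures."""
    n = len(procedures)
    adr = sum(p.adenoma_found for p in procedures) / n        # adenoma detection rate
    mean_withdrawal = sum(p.withdrawal_minutes for p in procedures) / n
    return {
        "adenoma_detection_rate": adr,
        "mean_withdrawal_minutes": mean_withdrawal,
    }

records = [Procedure(8.0, True), Procedure(6.5, False), Procedure(7.5, True)]
print(quality_summary(records))
```

In a deployed system these records would come from the in-procedure data-entry interfaces rather than a hard-coded list, and the summaries would be computed per endoscopist and per clinic.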

Student Involvement and Expected Outcomes:

Student participants will be expected to conduct usability studies with nurses and doctors in clinical settings while colonoscopies are being performed. They will become familiar with colonoscopy procedures and will help to evaluate the data collection system and provide recommendations for improving it. They will work with nurses to develop an online tutorial for the system. Depending on the success of the system, they will also help to deploy it to additional clinics. To the degree interested, students may also participate in the development of the tool, which is primarily a scalable web application with specialized data-entry interfaces. Desired technical skills include HTML, D3, JavaScript, and Java.

Prerequisites: Students should be comfortable attending many colonoscopy procedures in person. Students should have good interpersonal skills to work with nurses, doctors and patients in a clinical setting, and should be available to drive between UCI’s main campus and the UCI Medical Center. Students will need to produce written documents analyzing the results of the usability tests as well as tutorials for the system. For technical development, students should be comfortable writing Java code in a collaborative environment using open-source tools and methodologies. Experience with Eclipse, git, and responsive design, or a willingness to learn them, is required. Students should be comfortable problem solving, self-motivated, and able to research answers to potentially difficult technology-integration problems.

Recommended Web sites and publications: 
   Colonoscopy: http://www.webmd.com/colorectal-cancer/colonoscopy-16695
   Colonoscopy Tour Video: Removal of a Colon Polyp: https://www.youtube.com/watch?v=ewCIqAAJGPg&noredirect=1



 Project #7:  Using Wearable and Connected Sensing Platforms to Augment Pediatric Informatics
Faculty Mentor:  Professor Gillian R. Hayes, Informatics

Description:  This project is part of a larger ongoing effort focused on developing interactive interfaces that help people, especially children, understand their own health and well-being, and on automatically and algorithmically detecting features in the resulting sensor streams. Depending on the skills and interests of the student, we will craft a specific summer project addressing one or more of the following research questions:

- How reliably do Fitbits, Microsoft Bands, and other off-the-shelf wearable sensor platforms detect physical activity, physiological responses, and activities of interest in children?

- How can we best visualize the collected data for the children, their parents, teachers, and so on?

- What are potential designs for delivering tailored messaging to children and their caregivers based on sensed activities?

- What co-activities can be detected when multiple people wear sensors?

- How does the presence of such wearable sensing devices impact the relationship and communication patterns of the people wearing them?
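
As a toy illustration of the first question, a common baseline for detecting physical activity from a wrist-worn accelerometer is to threshold how far the acceleration magnitude deviates from gravity within a fixed-length epoch. The sketch below is not any particular device's algorithm; the window contents and the 0.15 g threshold are arbitrary choices for illustration.

```python
import math

G = 1.0  # gravity, in g units

def epoch_is_active(samples, threshold=0.15):
    """Classify one epoch of 3-axis accelerometer samples (in g) as active.

    Computes the mean absolute deviation of the acceleration magnitude
    from 1 g; large deviations indicate movement. Threshold is arbitrary.
    """
    deviations = [abs(math.sqrt(x*x + y*y + z*z) - G) for x, y, z in samples]
    return sum(deviations) / len(deviations) > threshold

still = [(0.0, 0.0, 1.0)] * 50                       # device at rest: magnitude ~1 g
moving = [(0.3, 0.4, 1.2), (0.5, -0.2, 0.8)] * 25    # invented movement samples
print(epoch_is_active(still), epoch_is_active(moving))
```

Validating a simple baseline like this against what the commercial devices report, and against ground-truth observation of children, is exactly the kind of reliability study the first research question describes.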

Student’s Involvement and Expected Outcomes:

Activities will depend on the student, but typically include:
- sketching, prototyping, software development, and other design- and development-oriented activities
- empirical studies of people's responses to both the hardware of the wearable devices and potential software designs
- algorithmic and statistical work using data collected by these sensors

Prerequisites: Students who are interested in continuing to work on the project in the 2015-2016 school year are preferred.

Recommended Web sites and publications: 
   Consolvo, S., McDonald, D. W., Toscos, T., Chen, M. Y., Froehlich, J., Harrison, B., Klasnja, P., LaMarca, A., LeGrand, L., Libby, R., Smith, I., and Landay, J. A. 2008. Activity sensing in the wild: a field trial of UbiFit Garden. In Proceedings of the Twenty-Sixth Annual SIGCHI Conference on Human Factors in Computing Systems (Florence, Italy, April 05-10, 2008). CHI '08. ACM, New York, NY, 1797-1806.
   Matic, A., Hayes, G.R., Tentori, M., Abdullah, M., & Schuck, S. (2014). Collective use of a situated display to encourage positive behaviors in children with behavioral challenges. In Proc. UbiComp 2014, 883-893. [Best Paper, Honorable Mention]
   Paredes, P., Sun, D., and Canny, J. (2013). Sensor-less sensing for affective computing and stress management technology. In Pervasive Computing Technologies for Healthcare (PervasiveHealth) 2013, 459-463.
   Pina, L., Rowan, K., Roseway, A., Johns, P., Hayes, G. R., & Czerwinski, M. (2014, May). In situ cues for ADHD parenting strategies using mobile technology. In Proceedings of the 8th International Conference on Pervasive Computing Technologies for Healthcare, 17-24.
   Pina, L.R., Ramirez, E., and Griswold, W.G. (2012). Fitbit+: A behavior-based intervention system to reduce sedentary behavior. In Pervasive Computing Technologies for Healthcare (PervasiveHealth), 175-178. IEEE.



 Project #8:  Vestibular Rehabilitation using Wide-Angled Head Mounted Displays with Stereoscopic 3D
Faculty Mentor:  Professor Hamid Djalilian, Otolaryngology

Description:  Additional Mentor:
Crista Lopes, Ph.D.
Department of Informatics, Donald Bren School of Information and Computer Sciences

Project Description:
The vestibular system of the inner ear is responsible for sensing orientation and rotation of the head. Additionally, the system primarily drives reflexes to maintain stable vision and posture [1]. A normal vestibular system can adjust reflexes based on varying situations but adaptation to a loss of vestibular function may be slow and result in sensations of dizziness and vertigo [1]. Patients with vestibular dysfunction may also experience visual vertigo (VV) or visually induced dizziness due to the mismatch of perceived visual and vestibular stimuli [2]. The current standard of care for vestibular dysfunction is vestibular rehabilitation exercises to hasten adaptation [3], [4], [5]. VV symptoms are believed to be caused by an excessive reliance on visual cues for perception and postural stability in patients with vestibular dysfunction [2], [6]. Subsequently, studies have shown that VV symptoms can only be reduced if vestibular rehabilitation exercises are combined with immersive moving visuals or optokinetic stimuli [7]. Recent studies demonstrated that graded exposure to optokinetic stimuli causes adaptive changes and decreased reliance on visual cues, thus, improving VV symptom [7], [6]. Another preliminary study also showed that patients with chronic vertigo improved after repeated exposure to optokinetic stimuli [8]. Furthermore, experiments demonstrated that the combination of vestibular exercises and optokinetic stimuli showed greater improvement in VV symptoms and postural stability compared to vestibular rehabilitation exercises alone [9]. One of the main challenges of implementing optikinetic stimuli is the cumbersome requirement of large screens and equipment [10]. Additionally, the stimuli should be a realistic and interactive environment in which the patient is immersed and result in adaptation and desensitization [11]. 
Head mounted displays have been suggested to provide an immersive optokinetic stimulus that can be easily implemented in clinical settings in conjunction with vestibular rehabilitation exercise programs [12].

The purpose of our research project is to design and develop a realistic, immersive virtual environment and to use Oculus Rift VR goggles to deliver graded, standardized optokinetic stimuli to our patients with vestibular dysfunction. The Oculus Rift goggles are a small, wide-angle head-mounted display (HMD) with stereoscopic 3D, capable of tracking head orientation and movement. They will provide an immersive optokinetic stimulus without the use of large monitors or screens. Oculus Rift VR goggles are typically used for gaming; however, we saw the potential medical application of the device. We plan to engage Oculus VR directly in our project, as the company is based only a few miles from UC Irvine’s campus. The Donald Bren School of Information and Computer Sciences will play a key role in the design and development of a customized virtual environment. To date, no other study has used HMDs with a customized virtual environment in conjunction with vestibular rehabilitation exercises. Additionally, other possible medical applications of HMDs in diagnostics and screening will be explored, since the vestibular system is also involved in other common conditions such as motion sickness and migraines.
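
One way to picture "graded" stimulation is as a per-session schedule for the angular velocity of the moving visual field rendered in the HMD. The sketch below is a hypothetical illustration of that idea only; the function name and every number in it are invented and do not represent a clinical protocol.

```python
# Hypothetical sketch: ramp the angular velocity of the rendered visual
# field across rehabilitation sessions, capped at a ceiling. All values
# are illustrative, not clinically derived.
def stimulus_velocity(session, base_dps=10.0, step_dps=5.0, max_dps=60.0):
    """Angular velocity (degrees/second) of the moving visual field
    for a given 1-based session number, capped at max_dps."""
    return min(base_dps + (session - 1) * step_dps, max_dps)

# Velocities for the first seven sessions of an example program.
schedule = [stimulus_velocity(s) for s in range(1, 8)]
print(schedule)
```

In an actual study the progression would be standardized from the rehabilitation literature and adjusted to each patient's tolerance, with the HMD's head tracking used to keep the stimulus stable relative to the patient's movements.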

Students’ Involvement and Expected Outcomes:

Students will be involved in most of the research process. They will conduct literature searches of relevant information, investigate possible areas of interest, and generate solutions to potential challenges. They will be challenged to devise methods of recruiting subjects for the study and will be involved in the design of the VR environment. Students are expected to gain useful skills in organizing data and conducting clinical trials, to take part in the writing and reporting process, and to analyze data and carry out statistical calculations.

We hope that the students will learn skills that will enable them to design and develop their own research and experiment. The multidisciplinary nature of the research creates opportunities for students to be exposed to different topics and industries. It also fosters a sense of teamwork between people of different backgrounds because completion of the task will not be possible without the cooperation of all parties. We hope to foster students’ interest in the applications of technology in healthcare.


Prerequisites: We prefer students with an interest in either medicine or computer science; at minimum, a health science or computer science background is required. Students with previous experience in clinical research or with programming skills will be preferred. Finally, students must be able to commit sufficient time to the project.

Recommended Web sites and publications: 
   Product website: http://www.oculusvr.com
   References:

1. Flint, P.W., & Cummings, C.W., Cummings Otolaryngology: Head & Neck Surgery. 2010, Philadelphia, PA: Mosby/Elsevier.

2. Bronstein, A.M., Visual vertigo syndrome: clinical and posturography findings. J Neurol Neurosurg Psychiatry, 1995. 59(5): p. 472-6.

3. Brown, K.E., et al., Physical therapy outcomes for persons with bilateral vestibular loss. Laryngoscope, 2001. 111(10): p. 1812-7.

4. Horak, F.B., et al., Effects of vestibular rehabilitation on dizziness and imbalance. Otolaryngol Head Neck Surg, 1992. 106(2): p. 175-80.

5. Whitney, S.L., et al., The effect of age on vestibular rehabilitation outcomes. Laryngoscope, 2002. 112(10): p. 1785-90.

6. Guerraz, M., et al., Visual vertigo: symptom assessment, spatial orientation and postural control. Brain, 2001. 124(Pt 8): p. 1646-56.

7. Pavlou, M., et al., The effect of repeated visual motion stimuli on visual dependence and postural control in normal subjects. Gait Posture, 2011. 33(1): p. 113-8.

8. Viirre, E. and R. Sitarz, Vestibular rehabilitation using visual displays: preliminary study. Laryngoscope, 2002. 112(3): p. 500-3.

9. Pavlou, M., et al., Simulator based rehabilitation in refractory dizziness. J Neurol, 2004. 251(8): p. 983-95.

10. Pavlou, M., et al., The effect of virtual reality on visual vertigo symptoms in patients with peripheral vestibular dysfunction: a pilot study. J Vestib Res, 2012. 22(5-6): p. 273-81.

11. Whitney, S.L., Sparto, P.J., Brown, K., Furman, J.M., Jacobson, J.L., and Redfern, M.S., The Potential Use of Virtual Reality in Vestibular Rehabilitation: Preliminary Findings with the BNAVE. Neurology Report, 2002. 26: p. 72–78.

12. Sparto, P.J., Furman, J.M., Whitney, S.L., Hodges, L.F., and Redfern, M.S., Vestibular rehabilitation using a wide field of view virtual environment. Conf Proc IEEE Eng Med Biol Soc, 2004. 7: p. 4836–4839.