Abhigyan Kaustubh AI. MR. UX.



A HoloLens application for disaster response teams.

Introductory video that briefly describes the primary use case of the application.


The main inspiration behind this project was to empower disaster response teams to quickly triage and respond to an emergency, to rebuild and restore affected areas faster and more effectively, and at the same time to address the most critical pain points faced by the response teams.

DisastARcons allows responders using the Microsoft HoloLens to assess damage and triage sites after an emergency event by visually inspecting and marking areas that need attention or that pose health/safety risks. This information is synced via the cloud to the Incident Command System (ICS), which uses these data points to direct the deployment of skilled teams to the area. Subsequent responders use DisastARcons to find and resolve areas tagged by earlier teams more efficiently than with conventional methods.

The project was born at the SEA-VR Hackathon IV in Oct 2016, where we won the Best Humanitarian Assistance Award. Since then, a subset of the team has continued developing it for a client.

Role: Project Lead/Product Manager, UX Researcher, Designer, Presenter

Key Activities: Research, Ideation, Prototyping, Design (Environment + Interaction + Key Features Hierarchy), Data Visualization, Coding, Demoing, Presentation and Evangelization, Client Support

Initial Team (at Hackathon): Abhigyan Kaustubh, Amanda Koster, Alicia Lookabill, Steven Dong, Tyler Esselstrom, Drew Stone, Evan Westenberger, Jared Sanson, Sebastian Sanchez

Final Team: Abhigyan Kaustubh, Amanda Koster, Alicia Lookabill, Steven Dong, Tyler Esselstrom, Drew Stone

Timeline: Oct’2016 – Present

Tools: Tableau, AWS, Unity, Balsamiq, Blender, Adobe Photoshop, Illustrator, Premiere, Visual Studio, HoloLens + its SDK, Asana

Website: http://disastarcons.com/



[Figure: Process flow]



Ground work + Research

Our process started with a high-level analysis of the problem space: why did we care? We realized that this was a space without optimal solutions, and that solving its problems could save lives in areas affected by natural disasters. This clarified our emotional motivation and got the team fired up to develop a solution.

Secondary Research

The second part was clearly defining the actual problem space and our value proposition/solution, and gauging its viability and short- and long-term adoption. To do this well, several things needed to be done (in series and in parallel):

  1. Research organizations in this field in terms of their needs, focus, specialties, customers, and pain points.
  2. Identify the target customer for whom we would design our product: what big problem of theirs would our product fix?
  3. Identify scenarios in which the target customer would use our product. Identify one main scenario where the product would be indispensable to them, and understand how frequently that scenario occurs.

Meanwhile, in parallel,

  1. Envision how an organization could help out in a disaster-affected area.
  2. What are the top three things that need to be done in such a scenario, and what is the best way of doing them? Be completely agnostic to any technology or process: understand the need at the most fundamental level, and then reason about how best to meet it.
  3. Are we developing a mixed reality solution just because this is a VR hackathon, or is there really a strong need that can only be met, at the highest level of efficiency, by a mixed reality application?

We started with the users for whom we were going to design our application. After considering several organizations that were involved in this field, their needs, focus, specialties, customers and pain points, we decided to narrow our target customer to FEMA.

We believed it was vital to be clear about the above aspects of the project before we dived into design and development. Hence, we iterated the above exercise a couple of times to gain a clearer comprehension.


Primary Research

We used two methods, based on the resources available to us: interviews, and observation through role playing.

The purpose of the interviews was to deepen (and cross-check) the comprehension gained from our secondary research, and to get a sanity check from the people closest to our users. Additionally, none of my team members had ever been in the intended user's scenario (and had no way of experiencing it with the resources and time we had), and we had limited experience in the field of mixed reality.
Hence, we interviewed experts from the fields of disaster management, accessibility, and mixed reality.

We interleaved this process with Observation methodology, which we implemented via Role Playing. This enabled us to find more pertinent questions to ask experts as we understood the scenario in the context of a potential user.

We gained the following insights:

  1. A lot of people might initially be enamored by the mixed reality application just because it was “cool” and a HoloLens was involved. This would bias users' feedback on how useful they found the app, especially when using it in a disaster-affected area.
  2. Movies like Iron Man, which depict augmented reality and are a significant initial motivation for people experimenting in this field, focus mainly on appealing to viewers rather than on usefulness to the character using the technology. For example, what is shown in the field of view should be as minimalistic as possible to reduce cognitive load.
  3. The environment in which the application will be used will be hostile and might limit accessibility for users. There should be multiple modes of interacting with the application for critical features.
  4. Along with focusing on minimalism and accessibility, the interface should be as universally comprehensible as possible.

The above process allowed us to come up with the following outline for our project (described from target user’s perspective):

  1. Situation:
    1. A magnitude 8.8 earthquake occurs, causing a devastating tsunami.
    2. First responders have performed search and rescue.
    3. You are a member of FEMA (Federal Emergency Management Agency), responsible for the coordination and response to a disaster that has occurred in the United States and that overwhelms the resources of local and state authorities.
  2. Problem Statement: A Government Accountability Office (GAO) 2015 audit report found:
    1. Response capability gaps through national-level exercises and real-world incidents
    2. Status of agency actions to address these gaps is not collected by or reported to the Department of Homeland Security or the Federal Emergency Management Agency (FEMA). – Anthony Kimery, Editor-in-Chief, Homeland Security Today.
  3. Proposed Solution: DisastARcons
    1. DisastARcons uses the Microsoft HoloLens for damage assessment by visually inspecting and marking areas that need attention or that are health/safety risks.
    2. DisastARcons increases efficiency in capturing and sharing accurate data AND measures the time between identification and resolution.
  4. Why Hololens?
    1. Always in front of you: the HoloLens utilizes the user’s entire field of view, unlike most devices, such as a cell phone, which offers only a limited rectangular view and depends on how the user holds the device.
    2. Example use case: For the second shift of maintenance workers, all data will always be easily accessible when relevant.
    3. Hands free
    4. Highest fidelity: HoloLens can do 3D, 360° (4π Steradian) construction of its surroundings.

Gaining Product Clarity

Integrating the above, we get the following high level scenario:


[Figure: High-level scenario]




Following the above process, we scoped our project in terms of main goals and extension goals, as follows.

Main Goal:

  1. To build a HoloLens application with the simplest possible interface that allows the user to mark hazards and assign severity rankings to them with accuracy and precision, based on the user’s inspection of their surroundings.
    1. The marking of hazards will take place through tagging, where appropriate holograms will be attached to the affected area.
    2. The severity of the hazard will be indicated by the color of the hologram.
  2. Safety mechanism: Since the user will be using this in a dangerous area, there should be a way for the user to call for help (911), easily & intentionally.

Extension Goals:

  1. Establishing a connection with the ICS (or a remote server) to populate data collected from different field agents.
  2. Update the information points on every HoloLens in the field.
  3. Send the information to the ICS for analysis.
  4. Craft an interface for the ICS to analyze the data quickly and give out directives to field agents.
  5. Add to the existing backend of ICS that allows them to utilize the hololens data points along with others in a seamless fashion.


The ideation process involved condensing results from different research methods and activities: role playing, using a custom hologram app on the HoloLens, 3D construction, expert interviews, concepts in accessibility, etc.

We used this to play around with different interface ideas and interaction methods while trying to refine the use case to its leanest form.

Based on our results, we came up with the following flow:

[Figure: The Disasters - Phase 1]


[Figure: Phase 2 - ICS + maintenance personnel POV]



The ideation process was translated into a UI flow for the app’s interface – with special emphasis on simplicity and ease of access.


[Figure: UI flow]


We built a mixed reality HoloLens application that allows the user to apply persistent tags to objects in their real environment and rate their severity. The most accurate available data points describing each hazard are recorded and transferred to the remote Incident Command System, so the hazards can later be located and attended to by other FEMA agents. The ICS analyzes all incoming data streams and pushes prioritized, relevant information back into the users' field of view, enabling them to restore the most critical affected areas while remaining safe and keeping track of new, potentially hazardous developments nearby.
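The persistent tag described above can be sketched as a simple record that a headset client might serialize and sync to the cloud. This is an illustrative sketch only: the field names (`severity`, `position`, `status`, `tag_id`) and the JSON wire format are assumptions for the example, not the actual DisastARcons schema.

```python
import json
import uuid
from dataclasses import dataclass, asdict, field

# Possible severity levels, shown to responders as hologram colors.
SEVERITIES = ("low", "medium", "high", "critical")

@dataclass
class HazardTag:
    severity: str         # one of SEVERITIES
    position: tuple       # (x, y, z) anchor in the headset's world space
    description: str = ""
    status: str = "open"  # stays "open" until a later team resolves the hazard
    tag_id: str = field(default_factory=lambda: uuid.uuid4().hex)

    def to_json(self) -> str:
        """Serialize the tag for upload to a cloud sync endpoint."""
        return json.dumps(asdict(self))

# Example: tag a hazard, serialize it, and restore it on another client.
tag = HazardTag(severity="high", position=(1.2, 0.0, -3.4),
                description="Collapsed stairwell")
restored = HazardTag(**json.loads(tag.to_json()))
```

A stable `tag_id` is what would let a second responder's device match a resolved hazard back to the original report.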

Next Steps

We are currently working on our primary extension goals (which are now our main goal):

To build the interface for the ICS and establish efficient data transmission between field agents and with the remote Incident Command System.

The process for that can be best represented by the following flow chart:

[Figure: Flow chart - building a functioning dashboard (front end + back end)]


The prototype for the eventual incident command center’s interface, which gives an overview of what is happening in the affected area, is as follows:


Phytoplankton Trends

Discovered 2 types of phytoplankton by using machine learning and data visualization on Flow Cytometry data.


This is an ongoing data science project at UW Seattle. The work was performed as a capstone project, primarily with the goal of gaining a better comprehension of marine biology by analyzing the available Flow Cytometry data.

Oceanographers use Flow Cytometry to measure the optical properties of a given sample of water through radial dispersion. This is done by attaching Flow Cytometers to the bottom of ships that conduct research, thus enabling coverage of a vast body of water.

We procured Flow Cytometry data sampled at 3-minute intervals and used a suitable clustering technique to identify regions of the water body with similar trends in microscopic life-form populations.
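As an illustration of the kind of clustering involved, here is a minimal k-means sketch over two synthetic optical-property features. The feature values and populations are invented for the example; the project's actual features and clustering technique may differ.

```python
import math
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: points is a list of (x, y) feature tuples."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        # Move each centroid to the mean of its assigned points.
        for i, members in enumerate(clusters):
            if members:
                centroids[i] = tuple(sum(v) / len(members) for v in zip(*members))
    return centroids, clusters

# Two synthetic phytoplankton-like populations in feature space
# (e.g. light scatter vs. fluorescence).
rng = random.Random(1)
pop_a = [(rng.gauss(1.0, 0.1), rng.gauss(1.0, 0.1)) for _ in range(100)]
pop_b = [(rng.gauss(4.0, 0.1), rng.gauss(4.0, 0.1)) for _ in range(100)]
centroids, clusters = kmeans(pop_a + pop_b, k=2)
```

With two well-separated populations like these, the two centroids settle near the population means, which is the sense in which clustering can "discover" distinct types in the cytometry measurements.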

To learn more about the project, please see the wiki.

Roles: Program Manager, Developer (Machine Learning + Data Visualization), Presenter

Key Activities: Acquiring client, gathering requirements, setting goals, creating a roadmap of deliverables, coordinating events with the stakeholders and ensuring that deliverables are on time, literature review, coding, data visualization, presentation


The primary contributors to this project’s repository are:

  1. Abhigyan Kaustubh
  2. Elton Dias
  3. Tanmay Modak

This repo was compiled and documented by Abhigyan Kaustubh.


  1. Bill Howe, eScience Institute, UW CSE
  2. Sophie Clayton, UW Oceanography
  3. Jeremy Hyrkas, UW CSE
  4. Daniel Halperin, UW CSE
  5. UW Oceanography Researchers (eScience Institute)

Timeline: Dec 2014 to Jun 2015 (7 months)

Sponsor: UW eScience

Poster for Presentation



fMRI Brain Scan

Predicted the object that the person is looking at by using Machine Learning on the fMRI data of their brain scans.

Tools: Python, Scikit-Learn

Author: Abhigyan Kaustubh

Role: Machine Learning Developer

Key Activities/algorithms/tools: Data Sanitization, Principal Component Analysis, Data Visualization, Machine Learning Algorithms

Timeline: 10 weeks

Stock Market Swings Prediction


Explored the possibility of predicting stock market swings using sanitized Twitter data, as our first venture into data science.

Team Members: Abhigyan Kaustubh, Brennen Smith, Padma Vaithyam, Wenxuan Zheng

Tools: R, Excel

Key Activities: Data sanitization, regression, sentiment analysis, lexical analysis, TF-IDF, Data Visualization (heat maps)

Timeline: 10 weeks


Statistical data concerning people’s opinions and emotions have, in some instances, been accurate in gleaning what will happen in the immediate or more distant future, when understood and applied correctly. Here, we adopted a data science approach to test our hypothesis that there is a correlation, and possibly causation, between the rise and fall of a company’s stock price and the corresponding tweets on Twitter. Through our research and analysis, we incorporated three different methods of sentiment and lexical analysis with the intention of making accurate predictions for 5 major companies, using the tweet keywords generated from January 2008 to March 2010.
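The TF-IDF weighting used in the lexical analysis can be sketched as follows. The actual analysis was done in R; this minimal Python version applies the standard tf × idf formulas (term frequency times log inverse document frequency) to a few invented tweets.

```python
import math
from collections import Counter

def tf_idf(docs):
    """docs: list of tokenized documents. Returns one {term: weight} dict
    per document, with tf = count/len(doc) and idf = log(N / doc_freq)."""
    n = len(docs)
    # Document frequency: in how many docs each term appears.
    df = Counter(term for doc in docs for term in set(doc))
    weights = []
    for doc in docs:
        counts = Counter(doc)
        weights.append({t: (c / len(doc)) * math.log(n / df[t])
                        for t, c in counts.items()})
    return weights

# Toy "tweet" corpus; real input would be sanitized tweet keywords.
tweets = [
    "stock up buy buy".split(),
    "stock down sell".split(),
    "great earnings buy".split(),
]
w = tf_idf(tweets)
```

Terms that appear in every document get an idf of zero, so TF-IDF automatically discounts ubiquitous words while up-weighting terms distinctive to a particular tweet.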

The resulting figures and visualizations showed that there was no correlation between stock market movement and the Twitter stock handles that we used.

The graphical results (graphs and heat maps) of the analysis can be viewed here.

Learn More

Textual Analysis

Conducted exploratory data analysis using data visualization to understand the nature of mentorship on fan-fiction sites. (Tableau, Python, CSV)


A fanfiction is a work of fiction written by fans of a TV series, movie, novel, etc., using existing characters and situations to develop new plots. This allows a lot of freedom and provides the individual with a sense of safety: they are free to create environments, situations, social constructs, relationships, and activities that might be very different from the norms of the society the fan is part of. It thus provides fertile ground for exploring new ideas, by which existing societies can be affected. In such a flat hierarchy, where the only merits of consideration are communication and creativity, users of different experience levels learn from each other and contribute to the domain. Studying such a social structure in the digital arena can therefore be beneficial for understanding human behavior in these environments, and the lessons learned can be applied to other situations.


As a member of a Directed Research Group in the Human Centered Design and Engineering department at the University of Washington, I coded the reviews on fanfiction sites against a set of attributes and performed exploratory data analysis using visualization to find and verify trends and correlations, thereby reinforcing the conclusions we derived about the nature of mentorship on fanfiction sites.

Infant Mortality in Africa

Exploratory data analysis using data visualization for finding trends/factors concerning infant mortality in Africa.


Conducted exploratory analysis of world economic data through effective data visualization, and identified strong correlations with factors possibly contributing to infant mortality in Africa.

For this project, I compiled the entire procedure along with my calculations and findings, which can be viewed here.

For accessing the Tableau file, please click here.

Learn More

Crisis Clinic Project


This project explores the different methods used to analyze the calls received by the Crisis Clinic across geography and time, in order to find useful insights in terms of discovering important trends, correlations, and possible causations. We analyze the call trends of 4 different lines: Crisis Line, Teen Link, Recovery Line, and 211, and specifically focus on the most common problem areas and needs, which we have analyzed with respect to geography in terms of ZIP codes and cities, and with respect to time from January 2010 to May 2014. Our exploration of the data shows that it is possible to extract useful information on the call behavior of the callers across geography and time through visual analysis. Based on these results, we explain how managerial decisions, specifically those relevant to funding of the Crisis Clinic, can be enhanced, and also focus on increasing public awareness by hosting the final set of visualizations as a dashboard on Tableau Public.

For a quick look describing the process and results, click here. For the detailed report, please click here.

Role: UX Researcher, Designer, Data Wrangler, Presenter

Key Activities: Literature review, data curation, interviews, prototyping and user evaluation, data visualization, usability study, presentation

Team Members: Abhigyan Kaustubh (AK), Emily Greenberg, Lana Pledger, Rijuta Trivedi

Timeline:  Apr 2014 – Jun 2014 (10 weeks)

Tools: Excel, Tableau, Powerpoint, SQL


Crisis Clinic is at the heart of the Seattle-King County safety net, providing a broad array of telephone-based crisis intervention and information and referral services. For many people in emotional distress or needing community services assistance, it is their “first call for help.” Every year, the Crisis Clinic receives a huge number of phone calls from King County residents in need of emotional support and community services. It has four main programs through which it provides its services:

  1. The 24 Hour Crisis Line offers emotional support to those in crisis or considering suicide;
  2. King County 2-1-1 offers information and referrals to community services based on its database of more than 5000 services;
  3. WA Recovery Help Line provides a statewide service offering emotional support and linkage to substance abuse, problem gambling, and mental health services to anyone in Washington State;
  4. Teen Link offers emotional support and assistance to teens by providing a teen-answered help line.

As a nonprofit organization, Crisis Clinic depends on the financial support of local government, United Way of King County, corporations and foundations, and the generosity of donors to keep its doors open and provide services.  In addition, it also serves as a central point for crisis resources that includes training, outreach, and a bridge to other organizations that may provide specialized support.

In this project we investigate the different ways of making interactive visualizations of the callers’ dataset to gain insights into its presence in the King County area, and also explore and understand patterns and trends in the calls they receive across geography and time.  We envision that these visualizations and insights will allow the staff at Crisis Clinic to better allocate resources in a targeted way, more effectively communicate the impact of Crisis Clinic to current and prospective funders, and allow the general public to better understand and appreciate its work in the King County area.

Learn More