NPS & AU Grassroots Distributed Innovation Sprint: An Interview with TSgt Daniel Hulter

Written by SSgt Austin Wiggins

To accelerate change within the Air Force and Space Force, we must equip potential innovators with education that promotes innovation at the lowest levels. Recently, CyberWorx had the fortune of hosting representatives of Air University (AU), the Naval Postgraduate School (NPS), the Army 75th Innovation Command, the Defense Innovation Unit, Army Space and Missile Defense Command, the U.S. Navy, MIT Lincoln Laboratory, Tecolote Research, and Lokahi LLC. Despite coming from disparate groups, they shared a common goal: fostering grassroots and distributed innovation through education and connecting academic initiatives to operational problems. The common understanding in the group: something had to change.

To uncover what made this session interesting and to highlight some hopes for the future of this project, I interviewed the facilitator of the event, CyberWorx member TSgt Daniel Hulter.

What were your expectations for the group going in?

I wasn’t sure what to expect from this group. It was clear they had a lot of topics they wanted to address regarding educational institutions’ role and needs as part of the defense innovation ecosystem. What wasn’t entirely clear was which of these topics were universally felt or what the priorities were. So, I knew that one of the first things we had to accomplish was getting on the same page about what needs were shared and which were more confined to a particular group.

One thing I did expect was that the conversation would need to be driven forward in order to move from observation and analysis to action. People generally have a high appetite for exploratory discussion, especially in academic circles, and that can make the transition to concrete action challenging. Some participants advocated for less structure in the session so that loose discussion could happen freely for longer periods while they were together, something we attempted to balance against the desire for clear courses of action.

In what ways were those expectations positively subverted?

The group performed very well in the more structured portions of the workshop. I won't say that subverted my expectations, but one thing I was nervous about was whether they would appreciate being time-boxed and asked to perform very specific analytical tasks.

There were particular exercises that the group took to extremely well, for example, their exploration of the defense innovation ecosystem through the lens of analogous systems. I remember feeling relieved that I wasn't having to goad or micromanage them through the exercises, which can be the case with certain groups.

What were your thoughts about the group’s objective to enable grassroots innovation through education?

I was slightly caught off guard by the inclusion of the "grassroots innovation" language in their problem statement. I realized that had I been more involved in crafting that problem statement, I might have been better prepared to speak to it. Instead, I simply put it in front of participants and saw how they chose to interpret and include it in their exploration of the topics that came up.

I do think that the inclusion of that language speaks to the motivation of some of the organizers of this event to try and approach innovation education from a new angle, looking at distributed effects rather than targeted, exclusive, role-based impacts. But one thing we did run into there was that some of the individuals who helped craft that language did not join us for the session itself, which further limited the degree to which that problem statement guided participants in the direction they took.

I think there’s an opportunity in the motivation within this community to do a bit more to try and reimagine what innovation education could look like. It feels like what we accomplished in this day and a half was some broad scoping and a picture of some of the primary systemic mental models that are driving the current shape of the education system. One of the NPS hosts mentioned a paradigm shift around how we even think about innovation education and that’s really interesting.

But this wasn't a workshop with a precise focus; it was much more exploratory. That question of how we might subvert or mutate or evolve the existing model to create something more aligned with the need for ground-level, grassroots innovation deserves significant time and attention, and it might be an interesting next step.

What are some outcomes you think that the group came to?

This group first achieved a few rough sketches of the current perceptual foundations of innovation education development and delivery for AU and NPS. Using that as an anchor, they identified ailments within the existing system. They also arrived at a few measures that might remove impediments to delivering their primary value to more of the force. The proposed measures they came to at the end of day one can serve as launching points for experiments these organizations can run, either together or separately.

Much of the time in this session was also devoted to discussion. I am confident these interactions had a positive impact on the participants as they navigated the issues that came up during exercises on day two, which was almost entirely unstructured. One common theme I heard was that this exact type of thing, coming together in one place and attempting to make sense of disparate and shared experiences across the defense education community, ought to happen more frequently, perhaps with an even wider variety of participants. I think this may turn out to be one of the more significant outcomes, as it has the potential for continued delivery of insights, increased alignment and sense-making, and a greater likelihood of success for the experiments that emerge from engagements like this.

What would you like to see from the organizations in this group in the future?

In the future, I would like to see these organizations do a few things:

Come together more frequently and invite more participants in, both with the type of forum that we created with this event and with more continuous connection mechanisms facilitated by platform selection/development, community building, and community management.

Spend more time thinking about and critiquing the perceptual foundations of their current organizational structures and strategies and put some dedicated time into seeing how those structures might be reimagined and redesigned.

Move forward with the experiments identified in the session as potentially high-value by giving them time and space in their own design efforts.

For those organizations who are seeking to empower grassroots innovation, do you have any recommendations?

My number one recommendation is to identify what is within the adjacent-possible for those you are expecting to innovate, for your particular context.

Every organization, unit, and team in the military has its own set of unique conditions and constraints, which means the approach to enabling innovation, and the form that innovation takes, has to be adapted at that level, a task that requires significant time and skill. Rather than seeking standardized adaptations that might work for every circumstance, or even focusing on the primary constraints of the larger system, I think it makes more sense to widely teach the very few underlying principles and practices that are universal, to try to effect a culture of acceptance, safety, and experimentation, and to spend your remaining energy and resources on enabling that discovery and adaptation to happen in all the places it needs to happen.

If we are talking about true ground-level innovation, speaking as someone who has spent the majority of my career at the tactical level, I think that means enabling people to be sense-makers and practitioners within their own context. Those practitioners can be plugged into more strategic and operational-level scaling and implementation efforts, so creating systems that transition and scale outcomes should be a priority for organizations as well. In my opinion, empowerment of grassroots innovation doesn't start with the question of "are we getting things across the finish line" from a top-level perspective. It starts with creating the opportunity for individuals to contribute and have actual impact on their immediate environment.

Another recommendation that I have is to tell more stories of failure. One thing that came up during our workshop was the fact that shame is a powerful motivator. We are never so innovative as when we are ashamed because we allowed a terrible disaster to occur. The most incredible transformations are possible when we actually reflect on the current state of things and the ways in which we are failing (either by choice or when forced by some kind of incident).

A mistake I see a lot of organizations making right now is telling too few stories about failure. We regularly identify the need to normalize and embrace failure within our culture in order to spur innovation, but we still don’t talk about it enough, especially in the context of innovation (I think largely because we feel the need to say we’re succeeding in order to keep getting funding and support from our organizations and leaders). Lots of efforts at enabling innovation are failing, and we need to be open and honest about that in order to create space and energy for the next pivot.

For the record, I am guilty of this exact thing. Storytelling and marketing are crucial components of success in all ventures, especially innovation. Building coalitions requires that we convince others of our likelihood of future success, and claiming present or past success is a potential pathway to that, but it suppresses the stories of failure that might allow us to be ashamed enough of the status quo to drive the change we need.

Do you have any closing thoughts?

It was wonderful to see disparate players from across the innovation ecosystem come together and navigate these difficult topics together, in both structured and unstructured ways, and I hope to see this exact type of thing happen more often.



Thank you, Daniel Hulter, for sharing your insights from the session. With your participation, we are one step closer to delivering real qualitative change to grassroots and distributed innovators throughout the Air Force.

For more information on this project, visit the NPS & AU Grassroots Distributed Innovation project webpage.

CyberWorx 2021 Quarter 1 Newsletter

Team Comments

The chemical company BASF used to have an advertisement that said, in part, "…we don't make the cooler, we make it cooler. We don't make the jeans, we make them bluer," to highlight how the company improved the experience, performance, or durability of products for the people who use them. Much like the ubiquity of the products in that advertisement, information technology now underpins nearly every aspect of daily life, both at work and at play. In a corollary, AF CyberWorx exercises its human-centered/user experience design, lean startup, and agile toolsets on the application of technology to improve the experience, performance, and efficiency of mission execution, preparation, and support, while helping prepare the Air Force for new and emerging technologies and aiding industry in better understanding operational needs.

Our Cyber Risk Ecosystem, presented in previous newsletters and currently in integration/deployment with NORAD/NORTHCOM, provides cross-domain risk awareness to help commanders understand and mitigate cyber threats to missions in other domains.  The recent low/no-code initiative for 16th AF demonstrated the efficacy of equipping unit-level Airmen with tools to automate and improve their own processes such as aircrew training and mobility readiness and awareness, network account request automation, contracting officer/small business interactive marketplace, personnel functions, and training instructor scheduling.  Our team has helped the F-35 JPMO improve mission communications planning, and we see the potential for that work to expand across multiple platforms and across the Joint Services. 

At AF CyberWorx, innovation isn't just about technology; it is about balancing user needs and organizational constraints with the right technology to make Airmen more effective at the full spectrum of activities needed to deliver Joint capabilities, from mission support through mission preparation and execution. At AF CyberWorx, we don't make the Commander's decisions, we help make them better informed, faster. We don't load the cargo, we make tracking it more efficient and accurate. We don't fly the mission, we make the planning easier, faster, and more precise.

Lt Col Helgeson, Deputy Director

CyberWorx 2021 Quarter 1 Project Portfolio

Genisys

The Air Force Test Center wants to accelerate the data processing pipeline to move useful, actionable test range data to decision makers faster. CyberWorx is working with Hill AFB’s EDDGE software team to design and develop MVP screens for an application that streamlines access to a centralized data platform.

AI Cyber Wingman

The Cyber Wingman project is focused on enabling an AI-based mission commander digital assistant that can compile and process large amounts of data into actionable information, including similar past events and recommended courses of action. The CyberWorx team is working with MIT Lincoln Laboratory, planning and coordinating research and interviews.

F-35 Enhanced UI

CyberWorx is providing expertise to a joint Navy, Air Force, and Marine Corps workforce, leading a deep dive into task analysis as they update their mission planning software to align with a next-generation framework and a modern user interface.

USSF Enlisted Talent Development

CyberWorx is working with USSF to identify how best to develop fully qualified enlisted Guardians who will be ready to meet future Space Force challenges.

Leading User Experience for Everyone (LUXE)

The Chief Experience Officer of the Air Force is spearheading an effort to improve the form and function of existing enterprise-wide applications. The LUXE project underscores the importance of UX when developing products and services and highlights the need for wider adoption of UX practices across the Air Force. The CyberWorx team is providing design consultation and resources for education and training on human-centered design methodology as well as the redesign of some of the Air Force’s most egregiously designed applications.

Kinderspot

AF Spark Tank finalist, Kinderspot, will offer families enrolled in Child Development Centers the ability to sublease their child’s spot to other eligible families while temporarily out of the area, Airbnb-style. The CyberWorx team worked with industry partner Oddball to optimize the user experience through the research and prototyping phase.

USAFA CRSP

CyberWorx facilitated a design sprint 1-2 March to investigate how to develop a Consolidated Resourcing Sight Picture for the USAFA Financial Management office that would enable effective use of resources and provide better decision support to the Commander.

Mission Assurance with Spectrum

We recently hosted nearly 150 government and industry representatives to answer the question of how we might use the variety of electromagnetic spectrum options to provide mission assurance of DoD operations. CyberWorx partnered with the Air Force Spectrum Management Office, the 350th Spectrum Warfare Group, and NineTwelve, a public-private partnership in Indiana, to run a 3-day virtual event connecting emerging spectrum technology with DoD stakeholders and users.

Weather AI Explainability

The Weather AI project will include explainable AI in future weather forecasting models, with an emphasis on overseas locations that have limited weather radar coverage, to develop high-fidelity and explainable products. We are teamed with MIT Lincoln Laboratory researchers on the approach and questions for user testing, in an effort to better inform the development of next-generation forecasting tools.

The Other Airmen

The Other Airmen initiative (pilot #1) successfully concluded on 11 March as six Airmen "Citizen Developer" teams pitched their solutions to the 16th Air Force Commander, Lieutenant General Timothy Haugh, at the Rocky Mountain Cyber Symposium. We are continuing work with the Air Force CIO office and 16th AF to find ways to provide the capability to Airmen across the enterprise.

USSF Cyber Officer Force Development

We are assisting the Space Force in developing a more agile, digitally-focused career development path for their cyber officers. 

21st Century Drill

Twenty-two participants from across government and industry attended a virtual design sprint from 26 January to 19 February to refine problem areas from information gathered through Guardsmen surveys about the collaboration and communication challenges they face getting ready for their drill weekends and annual training. A working group will continue work on the feasible solutions developed during the event to build the Air National Guard drill weekend of the future.

Airmen Leadership Qualities

CyberWorx is collaborating with HAF/A1H as they work to improve the Air Force's officer and enlisted evaluation systems. Data gathered through the AF-wide focus group sessions we facilitate will inform decisions on future evaluation system transformations.

USAFA Superintendent's Honor System Review

CyberWorx facilitated a design sprint for the Superintendent-directed Honor System review. Six teams of participants identified 31 ideas to improve the System and encourage cadets to embrace the Honor System as an ideal to aspire to, building leaders of character for the future.

Winter Is Coming

The 319th Reconnaissance Wing and the University of North Dakota have teamed up to build a culture of continuous innovation in the Airmen of Grand Forks AFB, ND. We facilitated a virtual education and design event for them, focusing on improving the living conditions for Airmen living in base dorms by creating an environment that encourages engagement, curiosity, creativity, and inclusion.

NORAD-USNORTHCOM (N-NC) Innovation and Culture

Government and industry participants explored how to increase the digital literacy of N-NC and create a culture that embraces an innovative mindset.

The Other Airmen Spotlight

The Other Airmen experiment aims to demonstrate that citizen developers from the Air Force and Army can create useful applications using low code/no code capability. The teams are transforming their use cases into working applications to present to the 16th Air Force Commander, Lt Gen Timothy Haugh, in March 2021.

24 Feb – The process to pull and compare information for a unit member – military or civilian – is labor-intensive.  It requires data to be cross-referenced from multiple sources including spreadsheets and specialized systems. Currently, personnel acting as Unit Deployment Managers, Unit Training Managers, Commander Support Staff, and other positions need to collect this information, compare it for accuracy and currency, and route it up to appropriate leadership.

Two Citizen Developers are collaborating on a unit-level application that imports and aggregates complex spreadsheets pulled from personnel systems, such as the Unit Manning Document, Unit Manpower Personnel Record, Alpha Roster, Gains and Loss rosters, and Civilian Roster. Users can also manually modify personnel records using "detail screens" which display specific information on an individual.
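
The application itself runs on a low/no-code platform, so we can't show its internals, but the aggregation step it automates is easy to sketch. Below is a minimal, hypothetical Python/pandas illustration; the file names, column names, and merge key are ours, not the developers'.

```python
# Hypothetical sketch of the aggregation step: merge several personnel
# rosters on a shared member ID and flag disagreements between sources.
# File names and column names are illustrative, not the real documents.
import pandas as pd

def load_rosters():
    umd = pd.read_excel("unit_manning_document.xlsx")   # authorized positions
    alpha = pd.read_excel("alpha_roster.xlsx")          # assigned military personnel
    civ = pd.read_excel("civilian_roster.xlsx")         # civilian employees
    return umd, alpha, civ

def build_unit_picture(umd, alpha, civ):
    # Stack military and civilian personnel into one roster.
    personnel = pd.concat([alpha, civ], ignore_index=True)
    # Outer-join against the manning document so vacancies and
    # unmatched members both surface instead of silently dropping.
    merged = umd.merge(personnel, on="member_id", how="outer",
                       suffixes=("_umd", "_roster"), indicator=True)
    # Rows present in only one source need human review.
    merged["needs_review"] = merged["_merge"] != "both"
    return merged
```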

The application suggests changes and lets users manually verify where changes or inaccuracies exist. Access to system data is role-based, granted when an administrator sets up each account. Role-based access maintains data confidentiality while still giving specialized personnel the access they need to gather information quickly for mission-essential tasks.
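
The spotlight doesn't say how the platform implements its role-based access, but the underlying pattern is simple. Here is a minimal sketch, assuming each role maps to the record fields it may read; the roles and field names below are illustrative:

```python
# Minimal sketch of role-based access control: each role maps to the
# record fields it may read. Roles and field names are illustrative.
ROLE_PERMISSIONS = {
    "unit_deployment_manager": {"name", "rank", "deployment_status"},
    "unit_training_manager":   {"name", "rank", "training_records"},
    "commander_support_staff": {"name", "rank", "duty_status"},
}

def visible_fields(record: dict, role: str) -> dict:
    """Return only the fields of a personnel record the role may see."""
    allowed = ROLE_PERMISSIONS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

record = {"name": "A. Airman", "rank": "SSgt",
          "training_records": "...", "deployment_status": "ready"}
print(visible_fields(record, "unit_training_manager"))
# {'name': 'A. Airman', 'rank': 'SSgt', 'training_records': '...'}
```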

The application is currently functional with access to real data. The Citizen Developers are adding features from their "want to have" list and gathering more information to provide a more robust example for the March presentation. They would like to import data directly from system databases, but current bureaucratic controls require manual spreadsheet imports. This application is a perfect example of individuals identifying the need for a tool that reduces time- and labor-intensive tasks to improve efficiency and accuracy. The sleek design also shows that even amateur developers can build relatively sophisticated applications that perform well and look professional in a short span of time.

A screenshot of the role-based token creation for the manning application.

31 Dec – Today’s Spotlight focuses on a Market Research and Solicitation tool designed for contracting officers to efficiently perform market research on a company submitting for a Federal Commercial Solutions Opening. Companies could upload pitch decks including a descriptive video and whitepapers to support the submission. The tool could foster continued engagement with an organization after an event or project is done.

Currently, data in the MVP is entered manually. It includes links to the beta.sam.gov announcement and any social media platforms the company uses. The citizen developer is seeking APIs and permissions to automatically pull company information using the DUNS identification number, a unique nine-digit number required for any company to register with the Federal government for contracts or grants. Other potential sources for automatic information include usaspending.gov for contract identification, the Air Force Installation Contracting Center business intelligence unit for Federal Supply Codes and Product Service Codes, and ready-made prospective sheets for the company. The end goal is for the tool to minimize manual entry and provide users with current information.

The citizen developer is an Air Force contracting officer whose current full-time position is an Education and Industry fellowship. She built the MVP on her own time on weekends after completing her training. The short timeframe from idea to MVP demonstrates how quickly a citizen developer can build a solution that improves their productivity.

10 Dec – Of the citizen developers volunteering their time and efforts, one has accelerated the development timeline and delivered a prototype. War Skills and Military Studies instructors need a more efficient and reliable way to schedule complex class coverage. The current process makes a team of instructors unavailable for teaching while they manually build the schedule. That draft schedule is visually compared against each instructor's leave schedule and the class schedule. If a mistake or unforeseen change occurs, the team has to rush to make updates and disseminate the new schedule to the instructors as quickly as possible.

The use case involves a scheduling application that allows a single user to easily add classes, instructors, and locations as well as deconflict instructors with scheduled leave. With the push of a button, the system processes the information and assigns instructors to classes. Instructors won't be scheduled while on leave or to teach two classes at the same time, and the system accounts for travel time between class locations. The application quickly produces an equitable schedule and allows the Scheduling Office to easily make changes and disseminate an updated schedule within minutes instead of days.
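
The prototype was built with low/no-code tooling, so we can't show its code, but the deconfliction logic described boils down to a constraint check plus an assignment rule. Here is a toy sketch under those assumptions; the data shapes are hypothetical, and it omits the travel-time constraint for brevity:

```python
# Toy sketch of the deconfliction logic: assign each class the least-loaded
# instructor who is not on leave and not already teaching at that time.
# Data shapes are hypothetical; the real tool runs on a low/no-code platform.
from dataclasses import dataclass, field

@dataclass
class Instructor:
    name: str
    leave_days: set = field(default_factory=set)
    assignments: list = field(default_factory=list)  # (day, hour) slots

    def available(self, day, hour):
        return day not in self.leave_days and (day, hour) not in self.assignments

def build_schedule(classes, instructors):
    """classes: list of (course, day, hour). Returns course -> instructor name."""
    schedule = {}
    for course, day, hour in classes:
        candidates = [i for i in instructors if i.available(day, hour)]
        if not candidates:
            schedule[course] = None  # flag for manual resolution
            continue
        # Prefer the instructor with the fewest assignments, for fairness.
        pick = min(candidates, key=lambda i: len(i.assignments))
        pick.assignments.append((day, hour))
        schedule[course] = pick.name
    return schedule

staff = [Instructor("Lee", leave_days={"Tue"}), Instructor("Cruz")]
print(build_schedule([("War Skills 101", "Tue", 9), ("Mil Studies", "Tue", 9)], staff))
# {'War Skills 101': 'Cruz', 'Mil Studies': None}
```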

That is the power of Low Code/No Code that The Other Airmen experiment is assessing for wider adoption. Citizen Developers know their pain points. Given the tools to develop their own solutions, DoD personnel can quickly improve unit efficiency.

11 Nov – Personnel responsible for member training requirements need a better way to track training data to reduce overhead and create a comprehensive picture of a unit’s ability to support the mission. Currently, commander support staff, unit training managers, and unit deployment managers have to sift through information from an array of sources including ADLS, TBA, IMDS, the Army’s DTMS, and various spreadsheets to piece together an accurate report. Users deserve a more efficient way to track training than cross-referencing 10 different systems to extract required data and manually input the information into a spreadsheet.

Several citizen developers are working on training tracker prototypes to streamline the current tedious process. An automated process will reduce the time spent manually entering data, improve accuracy, and allow Airmen to focus their time and effort on other unit priorities.

The Other Airmen: Low/No Code Milestone Two

Citizen Developers participating in The Other Airmen low/no code experiment made excellent progress toward functioning minimum viable products (MVPs) in December. While the second in-progress review (IPR) in December had three demos, the third IPR doubled that with six.

The big rocks for the developers in December were clearing up lingering AFNet connection issues, expanding the capabilities available to them, and discovering APIs for data connectivity between their new applications and existing databases. The most successful capability demonstrated during the 12 January IPR was the seamless integration of DocuSign into an application for document package routing and signing. Other capabilities the developers are incorporating into their solutions include automatic account creation through CAC registration and ranking data by confidence level. The breadth of use cases under development demonstrates the promise these platforms may hold for Airmen across the enterprise.

Several Citizen Developers are working on their solutions from home because of connection issues through AFNet. The ability to work on a solution from any location is important. We need to equip Airmen to work at their home station, while TDY, or even on deployment as long as they have an internet connection. We anticipate the barriers of going through AFNet will subside if low/no code is adopted as an enterprise approach for solution development.

As the teams pass the halfway mark of the experiment's scheduled timespan, they are transitioning from planning into development mode. The more mature solutions are moving from "get it working" to "how can it be better," with more automation, a stronger user interface, or broader database connectivity. The low/no code platforms and the Citizen Developers are exceeding expectations as they code future capabilities for their units.

The Other Airmen Milestone One

November was a busy month for our Citizen Developers participating in The Other Airmen experiment. It was primarily a month of preparation and learning. Citizen Developers and commercial partners formed relationships, ironed out technical issues, scoped their use cases, and developed work plans.

Database structure, security, and data access were big rocks for most of the developers. From commercial pitch decks for an interactive marketplace to storage of PDFs requiring signatures, developers solicited advice from AF experts to determine how best to store and disseminate sensitive data.

Meanwhile, a few teams played catch-up on platform training after working through access issues. Kudos to the military IT teams and the individual developers that worked through the firewall and permissions challenges quickly at this early stage! The teams can now focus on completing their concept maps including database and relational ties.

The real surprise came from a few teams that hit the ground running. They brought their commercial partner a well-developed concept map and plan, sprinted through training, and leapfrogged the timeline with a minimum viable product (MVP) already in hand.

December promises some exciting progress. The second in-progress review (IPR) happened December 8th with three demos: two working prototypes and a solid wireframe. The other teams were poised at the first IPR to move into the building phase of their projects, and they have continued moving forward. The experiment is going extremely well! Stay tuned for future progress updates on this Low Code/No Code experiment as we evaluate this capability for enterprise-wide use!

ARCHER: Collaboration for Excellence

AF CyberWorx and the Extreme Digital Development Group Enterprise (EDDGE) teamed up to tackle Project ARCHER virtually despite the challenges of COVID-19. The dream UX team wove active duty end users and the creators of the original tool into a working group for an incredible experience.

This event was a great opportunity to experience first-hand what Human-Centered Design is all about! EDDGE first reached out to AF CyberWorx to see how we could collaborate on an event, and they soon responded with an opportunity to empower software-enabled innovators.

The design process we went through really helped us understand the pain points and which parts of the tool were making the process frustrating. We went through many exercises that allowed the team of end users, trainers, and developers to understand the functionality the ARCHER team wanted.

Within one week of being assembled as a team, we took off! We were able to power through the design phase with a clear understanding of the requirement. The EDDGE team couldn’t be more excited to work with Captain Chris “Archer” Fry and the CyberWorx team to get this product developed and out to help our people!

We look forward to continuing to pair our experienced software developers with sharp Airmen to come up with products that are USABLE and maintainable. Virtual or in person, the collaboration with AF CyberWorx for Project ARCHER was a BLAST!

Virtual Sprint How-To Guide

By: Air Force CyberWorx UX Design Team

Executive Summary

The following is a collection of notes on various issues related to conducting remote or virtual sprint events. These notes are based both on our own experience and on dozens of articles we read on the subject. We do not advocate for any specific tools, but we do offer reflections on our experiences with them. Our intent is to help you conduct a successful sprint on your own.

Since each sprint is different and people have many different ways of conducting a sprint, this document does not describe the sprint process. Instead, we describe the functional issues you’ll need to address in order to provide a successful virtual sprint.

This is not an exhaustive review of various tools and methods, just experiential knowledge cultivated over time. Therefore, this is a rough guide to use as a data point, not a comprehensive set of rules.

Main Objective

The most difficult aspect of conducting virtual events is maintaining participant engagement and momentum throughout the event. To that end, we offer the following tips and tricks.

Changes to Our Design Sprints

Social distancing rules brought on by the COVID-19 pandemic forced us to develop a remote sprint capability to replace our in-person design sprint events. We quickly determined that we could not just cut and paste our renowned 3-day sprint process into a virtual environment. The virtual domain demanded we change the process, deliverables, and expectations according to the challenges presented by the participants and available technologies.

That said, we have had enough success with virtual sprints to consider this alternative when logistics (or acts of nature) prohibit us from bringing everyone into our studio.

All or None

If this works, what about combining virtual participants and in-person participants during an event? We wouldn’t recommend it. To put it simply, in-person participants will most likely end up dominating the conversation. Virtual participants can become frustrated and eventually become disengaged and go silent.

Participant engagement is a critical factor in a successful UX event. Engagement stays much more balanced when everyone is either virtual or in person, not a mixture of both.

Synch vs. Asynch

An in-studio sprint is usually a 2-3 full-day effort with many different group and breakout exercises. When everyone is in the studio, it's easy to manage this kind of breakout-and-regroup process. In a virtual environment, the process is limited by the technology and by distractions in each participant's workspace.

Distractions make it difficult to maintain attention on long conference calls. A virtual environment allows for a mixture of synchronous and asynchronous events. Synchronous events occur when everyone is together on the conference call for shorter periods of time. Asynchronous events are basically homework participants can complete alone or as a smaller team when timing (and lack of distraction) is best for them.

One good method is to describe and practice an exercise synchronously, such as creating storyboards or personas. Participants can then add to that body of knowledge by creating more of these artifacts asynchronously before the next session.

Pace Yourself

When we have 50 people fly in for an in-studio event, it makes sense to run the sprint over 2-3 days. In a virtual event, we can skip a day between events without losing momentum. Giving homework assignments on the off days keeps the participants engaged in their free time. Moreover, it allows more time for ideas and methods to sink in.

Attending a long conference call can be difficult for some. We recommend keeping the sessions to 2-3 hours and spreading them out over several days. Separating the sessions by a day allows you to assign 'homework,' asking the participants to revisit the digital whiteboards and add any additional insights that occurred to them. This approach proved useful in capturing insights that attendees did not have time to bring up during the session, and it helps keep them engaged with the effort.

Taking Breaks

Since we ran 3-hour sessions, we found a single 15-minute break in the middle of the session worked well. When participants returned, we asked them to announce in the (Zoom) chat that they were back from the break, for accountability.

Video Conference Tools

If you have more than 6 participants at your event, we have found breakout rooms to be a useful video conference feature for conducting remote sprints. You may want to send small teams into a breakout room to allow for more focused discussion and idea generation. If you expect to use breakout rooms in your sprint, be sure to enable the breakout room feature in the account settings (Zoom). You'll know it is enabled in Zoom if you see the Breakout Room icon next to the Record button at the bottom of the screen. Different tools do this differently; just be sure to check that your tool has all of the features you'll need enabled.

There is currently no consistency across the Air Force on the use of Zoom, so be prepared to switch to another tool. The Zoom Gov version is acceptable in some locations, whereas a paid Zoom subscription is acceptable in others. The free version of Zoom is always questionable. In any case, avoid any sensitive discussions (FOUO, PII, etc.).

We tested only a few tools that offer breakout room capabilities:

Zoom – This is our top choice. Most folks know how to use it or need little to no training.

Blackboard – Less common, but an Air Force-accepted platform (licensed by AETC and used for academics at the Academy).

BlueJeans – Pretty good functionality, similar to Zoom. It has a history of instability, though that may have improved over the years. This tool may not be approved for use on AF networks.

We didn’t test some tools for various reasons: time for testing, cost (free version insufficient), ease of use (based on reviews and discussions), software install required, and more.

Reviewed but not tested conference tools with breakout rooms:

NewRow – A remote classroom tool

Remo – A webinar tool

Use Phones for Audio

Recommend that participants use their phones to call in to the conference call. They can use the video conference tool to connect visually but should not rely on it for audio. Using a computer for both audio and video can double the internet bandwidth demand and create audio drop-outs. A video drop-out usually isn't much of an issue, but audio drop-outs are disruptive.

To ensure their phone connection follows them into breakout rooms, participants should link their audio to their video persona. If the two aren't linked, the video goes to the breakout room while the audio remains in the main room.

Standard Video Conference tools

As mentioned before, we highly recommend tools with breakout room capability. The difficulty of using tools without it isn't the technology but the human factor. To mimic a breakout room, the host needs to create a separate conference session for each team, send separate invites to the members of each room ahead of time, and tell them to log out of the current conference and log in to their specific room. You also have to make someone the moderator of each room, and procedural problems can arise if the chosen moderator doesn't attend that session.

Each of the sprint moderators has to log in to each room separately to answer questions or make announcements. On top of that, there is no easy way for members to ask questions of the facilitators. When the breakout session is done, participants have to repeat the process of quitting a room and logging back in to the main session.

This cumbersome process frustrates participants, interrupting their engagement and the sprint's momentum, and we found it takes extra time to reestablish engagement afterward. Standard video conference tools are acceptable for events that don't require breakout teams.

Collaboration Tools

There are many collaboration tools, but Mural and Miro were the most common among the UX teams who offered suggestions during this research. The tools fall into three categories: digital whiteboards, prototype and design tools, and full design-activity support. Since every sprint is different and every team has its own way of conducting sprints, we recommend that each team identify the tools that serve its needs. Remember, this is a living document and you are encouraged to add your knowledge and experiences here.

Digital Whiteboard Tools

There are several digital whiteboard tools available, but the free versions limit the number of projects or team members. We hope you try them out and report back.

Some common examples are:

Explain Everything – Getting some attention from UXers

Stormboard – Trending with some UX teams

Prototype and Design Tools

These are common collaborative tools used for sharing and commenting on visual design concepts. As such, they are not optimized for supporting activities like task flows or journey maps.

Some common examples are:

InVision

Figma

Balsamiq

Sketch

Axure

Adobe XD

Design Activity Support

This category is like a whiteboard, but with extra options that enable both design and non-design activities. We tested and used two tools successfully:

Mural

Mural offers a few more desirable facilitator functions, but it’s also a bit more difficult to learn how to use within the constraints of a virtual sprint.

Follow Me

This feature shows the moderator's board on all participant screens so everyone can follow the facilitator's work. It's useful, but we rarely needed it in sprints.

Miro

Easy to learn and use; it only takes a 15-minute practice session to get everyone up to speed. Miro lacks a few facilitator features that Mural offers, but it's still quite good.

Bring to Me

Miro has added a feature called Bring to Me that brings all or selected participants to the same area of the board. It is accessible from the member indicator circle in the upper right: click on your circle and then select the desired option.

Miro Board invites

Team members can invite people to the Team, which is necessary if you want them to have access to one or more projects. Once someone is a member of the Team, they can be invited to any project or board. Team members invited to a board are, by default, given access to all boards in that project.

A recent update allows you to invite guest editors via a URL, but be aware that there is no access protection on that link: anyone with the link can access your board (and therefore the information on it). Sharing is performed through the share feature on a board (upper right); you create a link and then send it to the invited guest editors (email, Slack, etc.).

Invited guests have access to only those boards you invite them to. They cannot see any other projects or boards.

To uninvite these guests, change the settings in the share dialog.

Through use, we developed a best practice to streamline Miro: keep a list of every invitee and their email address so you can quickly resend invites, and double-check that each participant is invited to both the Miro boards and the conference tool (Zoom).

We settled on Miro for our sprints after testing both Mural and Miro with the AF CyberWorx staff. Better still, we already had a license for it, so using it made sense.

VPN issues

We have discovered that government PCs on a VPN are often blocked from accessing many tools. It is advisable to ask participants to turn off their VPNs or use their personal computers.

Facilitators

All facilitators, regardless of their role, need the right permissions in both the conference call and the collaboration tool to enable smooth transitions and quick answers to questions.

Pay No Attention to That Man Behind the Curtain

Because of the various technology requirements, it is best to have someone on the periphery take command of the conference rooms and collaboration tools as a sort of producer, a Wizard of Oz genius behind the curtain. This lets the moderator(s) focus on activities without being distracted by technology issues. The more participants in a sprint, the more this becomes an issue. Typically, a lead facilitator and an assistant can handle up to 15 participants; any more than that requires the additional assistance of a Wizard of Oz.

Set up Breakout Rooms Ahead of Time

Work with the project stakeholder ahead of time to identify team makeup, avoiding teams overloaded with participants who share the same perspective or role, and assign attending participants based on these predefined teams. Reevaluate the teams during the sprint, as some folks may drop off or be unavailable during the time allocated for the breakouts. This is one of the tasks the 'producer' performs behind the scenes so the facilitators don't have to interrupt the sprint.

Moderators

Moderators need to be assigned as co-hosts. Each breakout room should have a moderator, and each moderator must be assigned to a team; otherwise, due to technical limitations, they will not be able to move between rooms.

One Board or Separate Boards?

For breakout sessions, it may be necessary to provide a separate private board available only to the members of that specific team. Separate boards reduce distractions and confusion over where on the collaboration screen each participant should focus.

Technology

We have yet to test these processes on a tablet, so we recommend participants use a laptop or desktop computer with sufficient processing and graphics capability. In some cases, depending on the collaboration tools used, personnel will need to use a personal device on a commercial network, while others may be able to use government devices on a government network. We highly recommend testing the tools before the start, with enough time to work out any technical issues.

Two Monitors

If possible, it is advisable to use two screens, one for the video conference tool (Zoom) and one for the collaboration tool (Miro). There are times when the moderator will be sharing a screen on Zoom and participants will be interacting with their collaboration boards.

Varying Degrees of Internet Access

Recognizing that different sites have different internet access rules and that not every tool can be used on a government computer or behind a firewall, it makes sense to test each site a participant will use to make sure they can access every tool. For instance, not everyone may have access to the collaboration tools (Miro and Mural); in that case, you may need to simply share your screen in the video conferencing tool. Test this far enough in advance to allow time to adjust the plan.

Pre-Sprint Checklist

Create visually large anchor points in the main project board (Miro) that are easy to find. This makes it easy to direct participants to the right area of a board (which can get pretty large). A large numbered circle is highly visible from the navigation map.

Practice Board

A practice board (Miro or Mural) lets invitees log in and use some of the common features prior to the event. Let them know you will be monitoring that board to make sure everyone can log in. Have attendees leave a "Kilroy Was Here" message on the board so you can track who was able to log in. If someone doesn't leave a message, be sure to reach out and ask why before the start of the sprint.
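
Tracking check-ins is trivial to script if you keep the invitee list handy. A tiny hypothetical example (the addresses are made up):

```python
# Hypothetical sketch: compare the invitee list against the people who
# left a "Kilroy Was Here" note on the practice board.
invitees = {"alice@base.mil", "bob@base.mil", "carol@base.mil"}
checked_in = {"alice@base.mil", "carol@base.mil"}  # gathered from the board

for missing in sorted(invitees - checked_in):
    print(f"Follow up with {missing}: no practice-board check-in yet")
```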

We developed practice boards for each feature we expected users to use during the sprint. We provided a sample artifact using each tool for users to recreate on their own. For instance, in one frame we showed some colored shapes and had users recreate those. In another frame, we had users connect shapes with the arrow connection tool.

Uploading Images

In some cases, you may want the participants to draw things, and not everyone is comfortable drawing on a digital whiteboard. Be prepared to let participants draw on paper with marker pens and upload a picture to the board. To facilitate this, have participants practice taking a picture and uploading it.

This also means that you should let participants know to have paper and markers on hand. Regular pens and pencils don’t register well enough when photographed and uploaded. Include this information in the invitation email.

Preferred Email Address

Be sure to ask participants for their preferred email address, not just their official address. Military email can have connection and delivery issues that would keep participants from getting essential sprint-related emails in a timely manner.

Test, Test, Test

Be sure to test all of your tools and features prior to launching your sprint. Enlist colleagues to participate in a dry run of your process and tools. Something will need adjusting, so plan for about an hour to do a full dry run.

In-Sprint Checklist

Introduce the collaboration tool (Miro) and review the practice steps to show how it should be done. This helps those who didn’t practice and those who struggled with the tool to better understand how the tool works.

Be sure to demonstrate how to navigate the boards using the map tool and the different pointers.

When it comes time to vote on something, have a PowerPoint slide ready to describe how to use the voting feature and show an example of what it looks like to the users. Indicate what to click on to place their vote. For instance, in Miro, if they click on an object in a drawing, that object will get the vote, not the drawing. This can be leveraged to promote some conversation by asking users to clarify what they were voting for.

Session Recordings

Be sure to record the meetings. It’s better to store the recordings on the local computer to avoid running out of space in the cloud. Zoom offers 1GB of cloud storage with a paid subscription, but that fills up in 4-8 hours.

Because Zoom only records what is shown in the Zoom screen, be sure to have the moderator share a screen with the digital whiteboard tool displayed in it.

Remind participants to call in on their phones and link their phone to their video feed. It may be best to describe how to do this as part of your welcome message. The Producer Behind the Curtain can tell who is linked and who isn’t and prompt them via direct chat.

Post-Sprint Checklist

We recommend capturing the session recording and making it available to the necessary parties.

Take advantage of the tool to capture relevant info from the boards. Miro has an export function that saves parts of a board to your computer as an image or a vector PDF. The vector PDF provides the greatest clarity and zooming capability for later use.

Create a survey asking folks what they would like to see to improve their experience.

In Summary

While conducting virtual sprints was a direct response to the limitations imposed by the COVID-19 lockdown, we learned that virtual sprints can be useful whenever gathering a group in our studio is not feasible. Hopefully, you will find this information useful when you need to conduct a remote or virtual UX event.

Remember, this is a living document that will benefit from your experiences – both successful and unsuccessful. Feel free to add comments or questions.

ARTIFICIAL OR AUGMENTED INTELLIGENCE: WHICH SOLUTION IS RIGHT FOR YOUR PROJECT?

Artificial Intelligence (AI) is the latest 'gadget' that companies want to add to every project. However, it's not the silver bullet many folks hope it to be; it has strengths and limitations. To overcome AI's limitations, Augmented Intelligence combines the strengths of Artificial Intelligence, Machine Learning (ML), and human capabilities.

The fundamental limitation of AI is not the programming but the design of the AI algorithms. Traditional Artificial Intelligence is a rules-based solution, and a rules-based AI requires complete and accurate rules. It's up to the designers to define the right rules, and historically, humans are not very good at predicting every eventuality or possibility, so we do a poor job of specifying them.
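
A toy example makes the brittleness concrete: a rules-based system responds correctly only to the conditions its designers anticipated and is blind to everything else. (Purely illustrative; the sensors and rules are made up.)

```python
# Toy rules-based system: it responds correctly only to the conditions its
# designers wrote rules for, and is blind to everything else.
def evaluate(sensor: str, value: float) -> str:
    if sensor == "temperature" and value > 100:
        return "overheat alarm"
    if sensor == "pressure" and value > 50:
        return "pressure alarm"
    # No rule anticipated this input: the system cannot respond sensibly.
    return "no rule matched"

print(evaluate("temperature", 150))  # "overheat alarm"
print(evaluate("vibration", 9.5))    # "no rule matched" -- an unanticipated case
```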

And then there’s the issue of the data used to train and operate an AI system.

Humans routinely arrive at successful solutions based on ambiguous, inaccurate, and incomplete data. Computer and AI systems require accurate data to derive an accurate solution. Unfortunately, inaccurate data is a common occurrence, and it’s not always possible to identify the inaccuracies. For instance, there may be inherent biases in the data that will alter the results. These biases may be created by virtue of how the data was collected or due to an unrepresentative data set.

Accurate data is critical, especially if the tool includes a Machine Learning component. A learning engine is only as good as the data it learns from. A good AI solution should include a learning engine to constantly evolve its rules based on actual usage.

Creating a training data set requires a lot of human processing, which may account for many of the unintended biases in the data set. A human decides what data to use to train a learning engine. Without knowledge of how the training algorithm works, they may exclude data that would otherwise be useful and include data that confuses the learning engine.
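
A small simulation shows how the collection method alone can skew a data set, even when every individual record is accurate. (Illustrative numbers only; it assumes a notional day-shift/night-shift survey.)

```python
# Illustrative simulation: the collection method alone biases the data.
# A survey handed out during day shift over-represents day workers, so the
# estimated average differs from the true population average.
import random
random.seed(0)

# True population: half day shift (satisfaction ~7), half night shift (~4).
population = [("day", random.gauss(7, 1)) for _ in range(5000)] + \
             [("night", random.gauss(4, 1)) for _ in range(5000)]
random.shuffle(population)

true_mean = sum(score for _, score in population) / len(population)

# Biased collection: day-shift members almost always respond; night shift rarely.
sample = [score for shift, score in population
          if shift == "day" or random.random() < 0.1][:1000]
biased_mean = sum(sample) / len(sample)

print(f"true mean {true_mean:.2f} vs biased-sample mean {biased_mean:.2f}")
# The sample overestimates satisfaction; no record was falsified, yet the
# collection method did all the damage.
```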

Besides data accuracy issues, AI systems are not designed to understand inferences as well as humans. An AI engine might be good at taking ice cream orders, such as a chocolate ice cream cone with candy chunks on it, but the same engine would not know how to handle a request for a cone, “Just like that one only with sprinkles.”

AI systems are designed and trained to perform specific functions with specific data sets. They are not yet sophisticated enough to generalize across all inputs to learn everything. Humans are uniquely capable of transferring knowledge across domains to operate effectively in a novel environment based on what we have learned in other, non-related environments.

Humans have limitations as well. We simply don't have the memory and processing capabilities of computers: we cannot process large amounts of data, keep track of many things at one time, or recall things with perfect accuracy. These human limitations are AI strengths.

So, one of the big questions is how to leverage the strengths of humans to overcome the weaknesses of AI, and vice-versa. Augmented Intelligence is a model that emphasizes the assistive role AI can have in enhancing human cognition rather than trying to mimic or replace it. A good example of the use of Augmented Intelligence is in the Automated Readiness Forecasting (ARF) tool designed by the AF CyberWorx team.

ARF accepts mission parameters, such as an exercise several months away, and schedules qualification events to ensure that appropriate personnel and resources are available for the exercise. It also keeps track of all necessary equipment maintenance as well as personnel training and medical activities needed prior to the exercise. It suggests schedule changes which will ensure the required resources are ready in time. Affected personnel only have to approve or acknowledge any adjustments. The tool then tracks progress, highlighting any deviations to the plan and suggesting corrections along the way to stay on track. Machine Learning uses the repeated iterations and changes to evolve the scheduling algorithm, fine-tuning its capabilities and reducing reliance on human intervention.

The readiness tool takes on the mundane tasks humans would otherwise have to perform to track readiness data, and it immediately adjusts the plan to accommodate any changes in resource availability. Normally, such adjustments would require dozens of people working for hours. Humans are still required to approve changes but are relieved of the time-consuming, error-prone work of adjusting schedules manually. This tracking and automation saves hundreds of resource hours, eliminates errors, and ensures a higher level of readiness.
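
ARF's internals aren't described here, but the augmented-intelligence pattern it embodies, where the machine proposes, the human approves, and the decisions accumulate as feedback, can be sketched in a few lines. (A hypothetical sketch; the function names and data are ours, not ARF's.)

```python
# Hypothetical sketch of the pattern ARF illustrates: the machine proposes
# schedule changes, a human approves or rejects each one, and the decisions
# are retained as feedback that could later tune the suggestion logic.
def suggest_adjustments(events, unavailable_days):
    """Propose slipping any event that collides with a known unavailability."""
    for name, day in events:
        if day in unavailable_days:
            yield (name, day, day + 1)  # naive proposal: slip one day

def review(suggestions, approve):
    feedback = []  # (suggestion, decision) pairs; grist for machine learning
    for s in suggestions:
        decision = approve(s)  # the human stays in the loop
        feedback.append((s, decision))
    return feedback

events = [("weapons qualification", 10), ("medical check", 12)]
log = review(suggest_adjustments(events, unavailable_days={12}),
             approve=lambda s: True)  # stand-in for the human decision
print(log)  # [(('medical check', 12, 13), True)]
```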

This example illustrates an Augmented Intelligence and Machine Learning paradigm that could be applied to many Air Force projects in place of a typical Artificial Intelligence approach. The point is, AI is not a panacea for all problems. AI succeeds when the problem domain is well understood, the data set is accurate, and the AI is focused on a specific task. Knowing when to design for Augmented Intelligence versus Artificial Intelligence, and the difference between the two, is crucial to success. Not every problem needs an AI solution.

*The postings on this blog reflect individual team member opinions and do not necessarily reflect official Air Force positions, strategies, or opinions.

Dissecting the Design Sprint Event

PART FIVE: AFTER THE SPRINT: NEXT STEPS

A design sprint’s ultimate goal is to improve a situation, whether that means improving an existing process or developing a new product. Not acting on the solutions a team suggests means a design event has not fulfilled its purpose. AF CyberWorx continues working beyond the sprint itself to help the results of a design sprint reach implementation.

The last step of the event is the outbrief where participants present their findings to the stakeholders. With team suggestions in mind, AF CyberWorx uses their own crack team to work with stakeholders to determine the next steps.

Greg Bennett, AF CyberWorx Relationship Manager, explains, “Every single sprint is going to be different. The outcomes are going to be different. The resources required on the back end are going to be different.” As such, AF CyberWorx customizes a roadmap to the end result according to the needs, limitations, and available resources of the problem owner. “We will continue to stay engaged as they want us to stay engaged,” Greg reassures. However, “We will not fight the fight for them.” Even with CyberWorx assistance, “Transition will not happen without [the champion’s] direct involvement and advocacy.”

Each champion and problem owner has different levels of capability and needs. As such, AF CyberWorx has different tools at their disposal to help. While some projects proceed best after transitioning to a different Air Force agency (which is an option), others are best served by going through a research and contracting process to get industry assistance. AF CyberWorx acts as a bridge between the problem owner and the contracting office to assist with that.

Once the problem owner and AF CyberWorx decide to pursue contracting options, Erica Wilson, Contracting Officer, and Casey Pehrson, Contracting Specialist, go to work. Erica explains that with AF CyberWorx acting as the technical point of contact, "we conduct market research to figure out if anybody's doing what they're wanting done, who's doing it, and what types of businesses are doing it. From there, we decide on a contracting vehicle." The contracting office looks for the most streamlined avenue based on available information, resources, and the problem owner's timeline. To best do that, as Casey says, "We just need to come together as a government team and find the best way forward."

The problem owner can help that process most by clearly defining their wants and needs: what they need, what they’re trying to buy or do, and what their limitations are. Information from the design sprint really helps with this, but the more specific the project parameters, the better the contracting office can find the best paths forward.

“Transition [to the end result] requires continued involvement with the customer,” Greg stresses, “to formulate strategies…and advocate with leadership, program offices, and sustainment functions.” It takes a dedicated team to champion a solution and transition it from the idea stage to implementation. From the discovery call laying the groundwork through the problem solving process to drawing out a roadmap to implementation, AF CyberWorx helps guide the process by connecting the right people at the right time. While we don’t do the fighting, we coach problem owners as needed to refine and mitigate Air Force challenges.

*The postings on this blog reflect individual team member opinions and do not necessarily reflect official Air Force positions, strategies, or opinions.

Dissecting the Design Sprint Event

STEP FOUR: THE OUTBRIEF

The finish line of a design sprint is the final outbrief. This last piece is when participants and decision makers see the results of the hard work everyone has put into finding a solution. The outbrief is an integration of everyone’s efforts during the event and includes all the design sprint elements: the refined problem statement, personas and scenarios, solution design, and all the pieces in between.

Vel Preston, AF CyberWorx Head of Innovation Design, describes the elements of an outbrief. Just as the event begins with the problem statement, the outbrief also starts with the problem. As Vel asks, “What’s the impact of the status quo?”

Stakeholders define the goal, but it’s the participants who explore the problem and determine how the status quo impacts the end users and, by extension, the mission. While the military mindset tends to put mission first, people enable the mission. As Vel explains, “We’re in the military. A lot of our problems revolve around lives at stake,” whether that means boots on the ground being supported by aircraft in the sky or personnel relying on finance for their paychecks. In exploring the human aspects of the problem, participants identify specific pain points to improve upon.

After briefers reiterate the problem, explain its impact, and outline the human element involved, they’re ready to lay out their solution. The ideal solution addresses the problem statement directly and pertains to the specific barriers and pain points identified during the design sprint journey. Vel explains that overall, “we want our listeners to follow the logic trail…Here’s what the status quo is, here’s the impact of keeping it that way, here are the things that’re getting in the way [of improving], and here’s how we can do it better.” The solution is the final piece that gives stakeholders a direction to a better future.

For the best response to an outbrief, participants should keep in mind more than just the problem and the potential solution. Briefers need to know whom they are briefing. Ideally, the audience will be the person or people who can say yea or nay to the next steps. When that's not possible, the person receiving the information should be someone who can champion the solution to those who do have a say. Briefers also need to understand what information the decision-makers need and make sure the team includes it: approximate costs, necessary resources, items to investigate further, and which parts of a solution are already in place and easy to use.

Though the purpose of a design sprint is to come up with new ideas and solutions to solve or mitigate a problem, there are still limits to keep in mind. As Vel says, “It’s harder to get behind someone who wants to restructure everything.” Resources across the DoD are shared among a lot of projects, programs, and departments. If a team attempts to change the world, even if that is what is ultimately needed, their solution may not gain much traction.

Vel explains that the best-received solutions come when the participants have drilled down to a single root cause that affects several pain points of the problem. "Because [a team] focused on the root cause…the integrated solution was more impactful and powerful." If they focus on the right problem, the solution will have a large impact and be well received by stakeholders, even if the root cause is relatively small and can be fixed with little effort and few resources.

The outbrief is not the end, however. As Vel encourages participants, "The outbrief should be the beginning of change that everyone is asking for." Viewing the outbrief as the beginning of change, instead of the end of an event, shifts the perspective from presentation and after-action to suggesting next steps and a path to improvement. AF CyberWorx works to enable change and improvement. Facilitating events and guiding experts through the process to a solution is simply a vehicle to enable that change, not an end in itself.

*The postings on this blog reflect individual team member opinions and do not necessarily reflect official Air Force positions, strategies, or opinions.