Team Comments – Q3

What does an AF CyberWorx "win" look like? As we touched on in previous newsletters, we use our unique blend of user experience and modern problem-solving/product development methods to tackle a wide range of operational and organizational challenges. The solutions can range from white-paper strategy and policy recommendations to re-engineered business processes to technology (software/hardware) solutions, or a combination of the three.

We had three major wins in the third quarter that represent how we work to support Air Force solutions:

  • Early Warning Radar Sustainment: As we supported Air Combat Command's legacy radar sustainment effort, we not only helped operators, maintainers, and sustainers bring together their different viewpoints of the challenge, but also combined a common future vision with acquisition strategies to engage industry and achieve the optimal solution for the Air Force.
  • Six Degrees of Kevin Beacon: In our work with the Air Force Rescue Coordination Center (AFRCC), we helped develop the concept for process flows and a software tool to streamline the identification of and response tracking for emergency beacons across the continental United States. We also helped prepare the center to submit for Small Business Innovation Research (SBIR) funds via AFWERX/AFRL, bringing together multiple partners in the AF innovation ecosystem and multiplying AFRCC's initial development investment.
  • OPTIMIS: The airlift-focused flight evaluation app, originally developed as a project by Academy cadets, took a different turn when we recognized complementary flight scheduling efforts under way at Kessel Run, another partner in the AF innovation ecosystem. We transitioned our user-tested minimum viable product into their portfolio to broaden its scope across all mission platforms, allow for full mission support integration, and provide for out-year sustainment.

What all our solutions have in common is that, through our end-user-focused processes, we re-imagine how people, organizations, and technology interact to best accomplish and support the Air Force mission. When we help with solutions, we don't just stop with an idea or concept to solve a problem. We follow through with strategies and coordination to identify and engage partners for development and transition to sustainment.

Successful outcomes from AF CyberWorx engagements are as diverse as the challenges they answer, but they always focus on the end user to maximize mission impact and bring to bear our full range of academic, industry, and government partnerships to maximize transition potential. Resolve, Accelerate, Deliver! What would an AF CyberWorx-powered win look like for your organization?

Good News – OPTIMIS – Q3

As Lt Col Helgeson stated above, we've had multiple wins over the last quarter. OPTIMIS is but one example, and it shows what we love to do: take a previously unaddressed end-user need, develop a prototype, and transition it to maturity and full sustainment.

In March 2018, the 21st Airlift Squadron (AS), Travis Air Force Base, California, reached out for help updating their home-grown pilot training and evaluation system, OPTIMIS. The unit built their database in MS Access years before and the developer had long since departed, leaving an unsupported system. Add to that their desire for more functionality and compatibility with the GTIMS system, and they were ready for some help.

Between November 2018 and November 2019, Academy cadets answered the call for help, using the real-world improvement project for a capstone course. As they worked on the improved app, they developed their skills in teamwork, user design fundamentals, and project management. By the final presentations, the cadets had created a mid-fidelity prototype that had already gone through the first phase of user testing at Travis AFB. Their time had come to graduate, however, so AF CyberWorx took the project and continued development toward transition.

AF CyberWorx carried OPTIMIS through more user testing, further refined the design, and found other programs pursuing similar objectives. Kessel Run was already working on other mission planning and support applications, so we worked to transition our refined prototype into their suite of matured applications. Recently, those efforts bore fruit as OPTIMIS transitioned for out-year sustainment. That's our favorite part of a "win": seeing the needs of end users addressed as a solution moves from an idea through prototyping and testing to sustainment and implementation. Here's to many more wins in the future.

ARCHER: Collaboration for Excellence

AF CyberWorx and the Extreme Digital Development Group Enterprise (EDDGE) teamed up to tackle Project ARCHER virtually despite the challenges of COVID-19. The dream UX team wove active-duty end users and the creators of the original tool into a working group for an incredible experience.

This event was a great opportunity to experience first-hand what Human-Centered Design is all about! EDDGE first reached out to AF CyberWorx to see how we could collaborate on an event, and they soon responded with an opportunity to empower software-enabled innovators.

The design process we went through really helped us understand the pain points and which parts of the tool were making the process frustrating. We went through many exercises that allowed the team of end users, trainers, and developers to understand the functionality the ARCHER team wanted.

Within one week of being assembled as a team, we took off! We were able to power through the design phase with a clear understanding of the requirement. The EDDGE team couldn’t be more excited to work with Captain Chris “Archer” Fry and the CyberWorx team to get this product developed and out to help our people!

We look forward to continuing to provide our experienced software developers to sharp Airmen to come up with products that are USABLE and maintainable. Virtual or in person, the collaboration with AF CyberWorx for Project ARCHER was a BLAST!

Q2 Newsletter

Team Comments

AF organizations need help solving big problems in more fields than just cyber. We've accelerated our pace and expanded our scope to meet the demand, utilizing our seasoned user experience (UX) team and proven research methods.

Putting the user first creates the best solutions for policy, operational, and organizational challenges, enabling emerging warfare constructs and addressing warfighter resiliency. Our methods work for all areas of expertise, designing minimum viable products wherever they're needed.

We work closely with problem owners and stakeholders so projects flow smoothly into development. Including partners from the start ensures everyone understands how to achieve successful implementation. Some paths include Other Transaction Authority, Commercial Solutions Offerings, or Small Business Innovation Research submissions. Our partners, including the 309 SXMG, SkiCAMP, EDDGE, and the CDO's office, have helped successfully transition projects to development.

We’re helping our customers with a broad range of projects including:

  • Air Force JADC2 test and exercise platform at Nellis AFB’s Shadow Operations Center
  • Kubernetes UI on a T-38 enabling the AF to land an aircraft with better software than it took off with
  • A common mission planning tool for the F-35 community
  • New approaches to resiliency at USAFA, helping medical and support organizations

We hosted thirteen government and industry specialists in a Business Intelligence (BizInt) design sprint 7-9 January 2020. BizInt was the first project in a working relationship with Air University Blue Horizons and focused on providing commanders with up-to-date logistics and contracting information. General David L. Goldfein, Chief of Staff of the Air Force, reacted positively to the BizInt app during a demonstration held on 14 May. Blue Horizons will demonstrate the tool to A4 and J4 soon for further leadership awareness. The future is bright for this new capability.

Project Archer – We teamed up with the Extreme Digital Development Group Enterprise to iterate on the home-grown ARCHER program for analyst exercise scenario creation and data collection.

Assessor/AEISS – NORAD and NORTHCOM personnel, industry partners, and our team explored how to improve the user interface for AEISS to speed appropriate AF threat response.

6 Degrees of Kevin Beacon – Government and industry specialists examined requirements for a new cloud-based mission management tool to allow quicker, more accurate responses to civil search and rescue.

616 OC Convergence – Government participants explored how information warfare could improve situational awareness and understanding to increase effectiveness.

Early Warning Radar Sustainment – Air Force personnel and industry participants came together to discover novel ways to sustain early warning radar capabilities for the next few decades.

CSfC – Government and industry specialists will explore the capabilities of Commercial Solutions for Classified (CSfC) in a discovery forum.

Kubernetes – How could the AF land an aircraft with better software than it took off with? AF CyberWorx is improving the UI/UX for a Kubernetes-based software platform on a T-38.

USAFA Strong – We are facilitating a project initiated by cadets and newly commissioned 2Lts focused on improving the mental health system at USAFA.

Virtual Sprint How-To Guide

By: Air Force CyberWorx UX Design Team

Executive Summary

The following is a collection of notes on various issues related to conducting remote or virtual sprint events. These notes are based both on our own experience and on dozens of articles on the subject. We do not advocate for any specific tools but share reflections on our experiences with them. Our intent is to help you conduct a successful sprint on your own.

Since each sprint is different and people have many different ways of conducting a sprint, this document does not describe the sprint process. Instead, we describe the functional issues you’ll need to address in order to provide a successful virtual sprint.

This is not an exhaustive review of various tools and methods, just experiential knowledge cultivated over time. Therefore, this is a rough guide to use as a data point, not a comprehensive set of rules.

Main Objective

The most difficult aspect of conducting virtual events is maintaining participant engagement and momentum throughout the event. To that end, we provide these tips and tricks:

Changes to Our Design Sprints

Social distancing rules brought on by the COVID-19 pandemic forced us to develop a remote sprint capability to replace our in-person design sprint events. We quickly determined that we could not just cut and paste our renowned 3-day sprint process into a virtual environment. The virtual domain demanded we change the process, deliverables, and expectations according to the challenges presented by the participants and available technologies.

That said, we have had enough success with virtual sprints to consider this alternative when logistics (or acts of nature) prohibit us from bringing everyone into our studio.

All or None

If this works, what about combining virtual participants and in-person participants during an event? We wouldn’t recommend it. To put it simply, in-person participants will most likely end up dominating the conversation. Virtual participants can become frustrated and eventually become disengaged and go silent.

Participant engagement is a critical factor in a successful UX event. Engagement stays much more balanced when everyone is either virtual or in person, not a mixture of both.

Synch vs. Asynch

An in-studio sprint is usually a 2-3 full day effort with lots of different group and breakout exercises. When everyone is in the studio, it’s easy to manage this kind of breakout and regroup process. In a virtual environment, this process is limited due to the technology and distractions in the participant’s workspace.

Distractions make it difficult to maintain attention on long conference calls. A virtual environment allows for a mixture of synchronous and asynchronous events. Synchronous events occur when everyone is together on the conference call for shorter periods of time. Asynchronous events are basically homework participants can complete alone or as a smaller team when timing (and lack of distraction) is best for them.

One good method is to describe and practice an exercise synchronously, such as creating storyboards or personas. Participants can then add to that body of knowledge by creating more of these artifacts asynchronously before the next session.

Pace Yourself

When we have 50 people fly in for an in-studio event, it makes sense to run the sprint over 2-3 days. In a virtual event, we can skip a day between events without losing momentum. Giving homework assignments on the off days keeps the participants engaged in their free time. Moreover, it allows more time for ideas and methods to sink in.

Attending a long conference call can be difficult for some. We recommend keeping the sessions to 2-3 hours and spreading them out over several days. Separating the sessions by a day allows you to assign 'homework,' asking the participants to revisit the digital whiteboards and add any additional insights that occurred to them. This approach proved useful in capturing insights that attendees did not have time to bring up during the session. It also helps to keep them engaged with the effort.

Taking Breaks

Since we ran 3-hour sprints, we found a single 15-minute break in the middle of the session to be good. When participants returned, we asked them to announce in (Zoom) chat that they were back from the break for accountability.

Video Conference Tools

If you have more than six participants for your event, we have found breakout rooms to be a useful video conference feature for conducting remote sprints. You may want to send small teams into a breakout room to allow for more focused discussions and the generation of ideas. If your sprint will use breakout rooms, be sure to enable the breakout room feature in the account settings (Zoom). You'll know it is enabled in Zoom if you see the Breakout Room icon next to the Record button at the bottom of the screen. Different tools do this differently; just be sure to check that your tool has all of the features you'll need enabled.

There is currently no consistency across the Air Force about the use of Zoom. Be prepared to switch to another tool. The Zoom Gov version is acceptable in some locations whereas a paid Zoom subscription is acceptable in others. The free version of Zoom is always questionable. In any case, avoid any sensitive discussions (FOUO, PII, etc.).

We tested only a few tools that offer breakout room capabilities:

Zoom – This is our top choice. Most folks know how to use it or need little to no training.

Blackboard – Less common, but an Air Force-accepted platform (licensed by AETC and used for academics at the Academy).

Blue Jeans – Pretty good functionality and similar to Zoom. Has a history of being unstable, but that may have improved over the years. This tool may not be approved for use on AF networks.

We didn't test some tools for various reasons: lack of time for testing, cost (free version insufficient), ease of use (based on reviews and discussions), required software installs, and more.

Reviewed but not tested conference tools with breakout rooms:

NewRow – A remote classroom tool

Remo – A webinar tool

Use Phones for Audio

Recommend to the participants that they use their phones to call in to the conference call. They can use the video conference tool to connect visually but should not rely on it for their audio. Using a computer for both audio and video can double the internet bandwidth used and create audio drop-outs. A video drop-out isn't usually much of an issue, but audio drop-outs are disruptive.

To ensure their phone connection follows them into breakout rooms, participants should link their audio to their video persona. If they aren’t linked, the video goes to the breakout room, and the audio remains in the main room.

Standard Video Conference tools

As mentioned before, we highly recommend tools with breakout room capability. The difficulty of using tools without breakout rooms isn't the technology, but the human factor. To mimic a breakout room, the host needs to create separate conference sessions for each team, send separate invites to the members of each room ahead of time, tell them to log out of the current conference, and have them log in to their specific room. You also have to make someone the moderator of each room. Procedural problems can arise if the chosen moderator doesn't attend that session.

Each of the sprint moderators will have to log in to each room separately to answer any questions or make announcements. On top of that, there is no easy way for members to ask questions of the facilitators. When the breakout session is done, participants have to repeat the process of quitting a room and logging back in to the main session.

This cumbersome process frustrates participants, interrupting their engagement and the sprint momentum, and we found it takes extra time to reestablish engagement afterward. Using standard video conference tools is acceptable for events that don't require breakout teams.

Collaboration Tools

There are many collaboration tools, but Mural and Miro are the most common among the UX teams who offered suggestions during this research. Tools fall into three categories: digital whiteboards, prototype and design tools, and full design activity support. Since every sprint is different and every team has its own way of conducting sprints, we recommend that each team identify the tools that serve their needs. Remember, this is a living document, and you are encouraged to add your knowledge and experiences here.

Digital Whiteboard Tools

There are several digital whiteboard tools available, but the free versions limit the number of projects or team members. We hope you try them out and report back.

Some common examples are:

Explain Everything – Getting some attention from UXers

Stormboard – Trending with some UX teams

Prototype and Design Tools

These are common collaborative tools used for sharing and commenting on visual design concepts. As such, they are not optimized for supporting activities like task flows or journey maps.

Some common examples are:

InVision

Figma

Balsamiq

Sketch

Axure

Adobe XD

Design activity support

This category is like a whiteboard, but with extra options that enable both design and non-design activities. We tested and used two tools successfully:

Mural

Mural offers a few more desirable facilitator functions, but it’s also a bit more difficult to learn how to use within the constraints of a virtual sprint.

Follow Me

This feature shows the moderator’s board on all of the participant screens so they can follow the facilitator’s work. A useful feature, but not used that often for sprints.

Miro

Easy to learn and use. It only takes a 15-minute practice session to get everyone up to speed. Miro lacks a few facilitator features that Mural offers, but it's still quite good.

Bring to Me

Miro has added a new feature called Bring to Me that brings all or selected participants to the same area of your board. It is accessible from the member indicator circle in the upper right: click on your circle and then select the desired option.

Miro Board invites

Team members can invite people to the Team, which is necessary if you want them to have access to one or more projects. Once someone is a member of the Team, they can be invited to any project or board. Team members invited to a board are, by default, given access to all boards in that project.

A recent update allows you to invite guest editors via a URL, but be aware that there is no access protection on that link. Anyone with the link can access your board (and therefore the information on it). This sharing is performed through the share feature on a board (upper right). You create a link and then send it to the invited guest editors (email, Slack, etc.).

Invited guests have access only to those boards you invite them to. They cannot see any other projects or boards.

To uninvite these guests, change the settings in the share dialog.

Through use, we developed a best practice to streamline using Miro: keep a list of every invitee and their email address ready so you can quickly resend invites, and double-check that each participant is invited to both the Miro boards and the conference tool (Zoom).

We settled on Miro for our sprints after testing both Mural and Miro with the AF CyberWorx staff. Even better, we already had a license for it, so it made sense to use it.

VPN issues

We have discovered that government PCs on a VPN are often blocked from accessing many tools. It is advisable to ask participants to turn off their VPNs or use their personal computers.

Facilitators

All facilitators, regardless of their role, need the right permissions in both the conference call and the collaboration tool to enable smooth transitions and quick answers to questions.

Pay No Attention to That Man Behind the Curtain

Because of the various technology requirements, it is best to have someone on the periphery take command of the conference rooms and collaboration tools as a sort of producer, or Wizard of Oz genius behind the curtain. This lets the moderator(s) focus on activities without being distracted by technology issues. The more participants in a sprint, the more this becomes an issue. Typically, a lead facilitator and assistant can handle up to 15 participants, but any more than that requires the additional assistance of a Wizard of Oz.

Set up Breakout Rooms Ahead of Time

It is better to work with the project stakeholder to identify the team makeup. Avoid overloading teams with participants who share the same perspective or role. Assign attending participants based on these predefined teams. Reevaluate the teams during the sprint, as some folks may need to drop off or may not be available during the time allocated for the breakouts. This is one of the tasks the 'producer' needs to perform behind the scenes so that the facilitators don't have to interrupt the sprint.

Moderators

Moderators need to be assigned as co-hosts. Each breakout room should have a moderator assigned, and each moderator must be assigned to a team; otherwise, due to technical issues, they will not be able to bounce around to other rooms.

One Board or Separate Boards?

For breakout sessions, it may be necessary to provide a separate private board available only to the members of that specific team. Separate boards reduce distractions and confusion over where on the collaboration screen each participant should focus.

Technology

We have yet to test these processes on a tablet, but we recommend participants use a laptop or desktop computer with sufficient processing and graphics capabilities. In some cases, depending on the collaboration tools used, personnel will need to use a personal device on a commercial network, while others may be able to use government devices on a government network. We highly recommend testing the tools before the start, with enough time to work out any technical issues.

Two Monitors

If possible, it is advisable to use two screens, one for the video conference tool (Zoom) and one for the collaboration tool (Miro). There are times when the moderator will be sharing a screen on Zoom and participants will be interacting with their collaboration boards.

Varying Degrees of Internet Access

Recognizing that different sites have different internet access rules and not every tool can be used on a government computer or behind a firewall, it makes sense to test each site a participant will use to make sure they can access and use each of the tools. For instance, not everyone may have access that allows them to use the collaboration tools (Miro and Mural). In that case, you may need to simply share your screen in the video conferencing tool. Test this far enough in advance to allow time to adjust the plan.

Pre-Sprint Checklist

Create visually large anchor points in the main project board (Miro) that are easy to find. This makes it easy to direct participants to the right area of a board (which can get pretty large). A large numbered circle is highly visible from the navigation map.

Practice Board

A practice board (Miro or Mural) lets invitees login to the board and use some of the common features prior to the event. Let them know you will be monitoring that board to make sure everyone can log in to it. Have attendees leave a “Kilroy Was Here” message on the board so you can track who was able to login. If someone doesn’t leave a message on the board, be sure to reach out to ask why before the start of the sprint.

We developed practice boards for each feature we expected users to use during the sprint. We provided a sample artifact using each tool for users to recreate on their own. For instance, in one frame we showed some colored shapes and had users recreate those. In another frame, we had users connect shapes with the arrow connection tool.

Uploading Images

In some cases, you may want the participants to draw things. Not everyone is comfortable drawing on a digital whiteboard. Therefore, be prepared to let participants draw things on paper with marker pens and upload a picture of it to the board. To facilitate this, you should have participants practice taking a picture and uploading it.

This also means that you should let participants know to have paper and markers on hand. Regular pens and pencils don’t register well enough when photographed and uploaded. Include this information in the invitation email.

Preferred Email Address

Be sure to ask participants for their preferred email address, not just their official address. Military email can have connection and delivery issues that would keep participants from getting essential sprint-related emails in a timely manner.

Test, Test, Test

Be sure to test all of your tools and features prior to launching your sprint. Enlist your colleagues to participate in a dry run of your process and tools. Something will need adjusting, so plan for about an hour to do a full dry run.

In-Sprint Checklist

Introduce the collaboration tool (Miro) and review the practice steps to show how it should be done. This helps those who didn’t practice and those who struggled with the tool to better understand how the tool works.

Be sure to demonstrate how to navigate the boards using the map tool and the different pointers.

When it comes time to vote on something, have a PowerPoint slide ready to describe how to use the voting feature and show an example of what it looks like to the users. Indicate what to click on to place their vote. For instance, in Miro, if they click on an object in a drawing, that object will get the vote, not the drawing. This can be leveraged to promote some conversation by asking users to clarify what they were voting for.

Session Recordings

Be sure to record the meetings. It’s better to store the recordings on the local computer to avoid running out of space in the cloud. Zoom offers 1GB of cloud storage with a paid subscription, but that fills up in 4-8 hours.

Because Zoom only records what is shown in the Zoom screen, be sure to have the moderator share a screen with the digital whiteboard tool displayed in it.

Remind participants to call in on their phones and link their phone to their video feed. It may be best to describe how to do this as part of your welcome message. The Producer Behind the Curtain can tell who is linked and who isn’t and prompt them via direct chat.

Post-Sprint Checklist

We recommend capturing the session recording and making it available to the necessary parties.

Take advantage of the tool to capture relevant info on the boards. Miro has an export board function to save parts to your computer as an image or vector PDF. The vector PDF allows the greatest clarity and zooming features for later use.

Create a survey asking folks what they would like to see to improve their experience.

In Summary

While conducting virtual sprints was a direct response to the limitations imposed by the COVID-19 lockdown, we learned that virtual sprints may be useful when gathering a group in our studio is not feasible. Hopefully, you will find this information useful when you need to conduct a remote or virtual UX event.

Remember, this is a living document that will benefit from your experiences – both successful and unsuccessful. Feel free to add comments or questions.

April Newsletter – 2020

DIRECTOR’S COMMENTS

I'm often asked what AF CyberWorx is, or what kind of Cyber products we develop. My reply is always that we are a problem-solving organization. It doesn't matter if the problem involves Cyber, C2, Personnel Recovery, Logistics, Training, Personnel, or even the Mental Health problem. We are about to tackle them all with a few brand-new 2Lts and USAFA cadets. We believe that Human-Centered Design is applicable to any discipline, not just software development. Any problem can and should be approached with a focus on the user experience; otherwise, why bother?

As far as development goes, the only Products we create are the Minimum Viable ones. They can be a software application, tech solution recommendation, process re-engineering, or a simple white paper.

As for our methodology, we do not start anything without the "problem owner," i.e., the stakeholder and user. If given time, we'll do as much research and as many discovery calls as possible. We'll bring in users, problem owners, problem stakeholders, industry, and academia. We put a priority on finding and involving the solution sustainer right from the get-go. This is the best team makeup to ensure we are going after the right problem, identifying the most viable capabilities to meet the user needs, and collaborating on a Minimum Viable Product (MVP) to ensure the user gets exactly what they want. For the proverbial cherry on top, the sustainer is able to iterate on that MVP and be sure they're working on the right solution. Respect to the sustainers who know and appreciate MVPs!

While we may not have vast resources or big-time shout-outs on social media and at tech conferences, we are a small team of doers punching well above our weight class and quietly delivering solutions for some of the DoD's most complex problems. Hopefully this helps explain a bit about AF CyberWorx and what we are about. We look forward to solving a problem with you someday!

Q1 Wins

OpsAI – Government and industry organizations networked and explored operational use cases of artificial intelligence with the possibility of contracts, SBIRs, and design sprints as paths forward.


SHoCnAwe – Twenty-five government and industry participants collaborated in Las Vegas to identify the future capabilities of the Shadow Ops Center.

BizInt – Fifteen participants from Blue Horizons examined the needs of contracting personnel working on deployment requirements.


COVID-19 response – We moved to a virtual office, 3D-printed mask extensions, and shifted current projects to virtual design sessions.

Future Wins

Archer – Air Force exercise developers and analysts are crafting a solution for creating, tracking, and analyzing exercise scenarios and datapoint information.

Morpheus CSfC – Government and industry participants will explore and ideate on the utilization of Commercial Solutions for Classified.

Six Degrees of Kevin Beacon – The AFRCC is improving how they assess and manage emergency response beacon signals for faster response time and continuity of operations.

Smells Like Convergence – Key stakeholders will collaborate on solutions for converging information and unit capabilities to optimize 16th AF’s Information Warfare mission.

Radar Modernization – ACC and LCMC need possible solutions for sustaining NORAD/NORTHCOM's early warning radar capability for the next two decades.

Tech in Transition

CRE (Cyber Risk Ecosystem) – A multi-domain C2 tool assisted by AI and machine learning, measuring and conveying cyber risks to commanders.

ARF (Automated Readiness Forecasting) – Commanders need an interactive tool to suggest corrective actions and view readiness information.

OPTIMIS – A mobile app that helps instructor pilots notate, track, and update flight evaluations in flight quickly and accurately.

DONE RITE

At AF CyberWorx, we pride ourselves on our capability for Rapid Iterative Testing and Evaluation (RITE). RITE, by its very definition, iterates through gathering requirements, designing prototypes, testing those prototypes with users, learning what did and did not work from real users, and repeating throughout the design and development process. This increases the functionality and usability of the end product and improves the overall quality for the end user.

The RITE process has been used by our designers to great effect. With OPTIMIS, instructor pilots track training details while still in the air with an intuitive mobile application. Commanders gain actionable situational awareness of their unit's readiness and needs with the Automated Readiness Forecasting Tool. The Digital University maps goals and paths to success for cyber professionals, and the as-yet unnamed business intelligence tool gives contracting personnel a powerful application to easily capture and share important information across the operational area from planning to inspecting. RITE enables rapid improvement during the design and development of a project through user testing and immediate iteration.

What can RITE do for you?

ARTIFICIAL OR AUGMENTED INTELLIGENCE: WHICH SOLUTION IS RIGHT FOR YOUR PROJECT?

Artificial Intelligence (AI) is the latest ‘gadget’ that companies want to add to every project. However, it’s not the silver bullet that many folks hope it to be. It has its strengths and limitations. To overcome AI’s limitations, Augmented Intelligence optimizes the strengths of Artificial Intelligence, Machine Learning (ML), and human capabilities.

The fundamental limitation of AI is not the programming, but the design of the AI algorithms. Artificial Intelligence is a rules-based solution, and a traditional rules-based AI requires complete and accurate rules. It’s up to the designers to accurately define the right rules. Historically, humans are not very good at predicting every eventuality or possibility, and thus do a poor job of specifying the rules.
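
To make the point concrete, here is a minimal, hypothetical Python sketch (not an AF CyberWorx system) of a rules-based classifier. Every rule the designer wrote works as intended, but anything the designer failed to anticipate falls through:

def classify_request(text: str) -> str:
    """Toy rules-based 'AI' for routing support requests (illustrative only)."""
    text = text.lower()
    if "radar" in text:
        return "radar maintenance"
    if "engine" in text or "hydraulic" in text:
        return "aircraft maintenance"
    if "password" in text or "login" in text:
        return "IT help desk"
    return "unknown"  # every case the rule authors failed to predict lands here

print(classify_request("Radar array is down at site 3"))  # radar maintenance
print(classify_request("Can't get into my account"))      # unknown: the authors
# wrote rules for "password" and "login" but never anticipated "account", so the
# request is silently mis-routed. Learning-based systems shift this burden from
# rule writing to data curation, which carries its own pitfalls, as discussed below.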

And then there’s the issue of the data used to train and operate an AI system.

Humans routinely arrive at successful solutions based on ambiguous, inaccurate, and incomplete data. Computer and AI systems require accurate data to derive an accurate solution. Unfortunately, inaccurate data is a common occurrence, and it’s not always possible to identify the inaccuracies. For instance, there may be inherent biases in the data that will alter the results. These biases may be created by virtue of how the data was collected or due to an unrepresentative data set.

Accurate data is critical, especially if the tool includes a Machine Learning component. A learning engine is only as good as the data it learns from. A good AI solution should include a learning engine to constantly evolve its rules based on actual usage.

Creating a training data set requires a lot of human processing, which may account for much of the unintended biases in the data set. A human decides what data to use to train a learning engine. Without knowledge of how the training algorithm works, they may exclude data that would otherwise be useful and include data that confuses the learning engine.
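
A small, hypothetical Python sketch (invented numbers, not drawn from any real data set) illustrates how well-meaning human curation can skew what a learning engine sees:

import random

random.seed(0)

# Ground truth: assume 30% of sorties generate a maintenance write-up.
sorties = [{"writeup": random.random() < 0.30} for _ in range(10_000)]

def learn_rate(records):
    """'Training' here is simply estimating the write-up rate from the data given."""
    return sum(r["writeup"] for r in records) / len(records)

# A curator discards roughly half of the write-up records as "duplicates/outliers"
# before training, not realizing the filter is correlated with the label itself.
curated = [r for r in sorties if not (r["writeup"] and random.random() < 0.5)]

print(f"true rate:    {learn_rate(sorties):.1%}")   # about 30%
print(f"learned rate: {learn_rate(curated):.1%}")   # noticeably lower, biased by curation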

Besides data accuracy issues, AI systems are not designed to understand inferences as well as humans. An AI engine might be good at taking ice cream orders, such as a chocolate ice cream cone with candy chunks on it, but the same engine would not know how to handle a request for a cone, “Just like that one only with sprinkles.”

AI systems are designed and trained to perform specific functions with specific data sets. They are not yet sophisticated enough to generalize across all inputs to learn everything. Humans are uniquely capable of transferring knowledge across domains to operate effectively in a novel environment based on what we have learned in other, non-related environments.

Humans have limitations, as well. We just don't have the memory and processing capabilities of computers. We are not able to process large amounts of data, we cannot keep track of many things at one time, and we cannot recall things with perfect accuracy. These human limitations are AI strengths.

So, one of the big questions is how to leverage the strengths of humans to overcome the weaknesses of AI, and vice-versa. Augmented Intelligence is a model that emphasizes the assistive role AI can have in enhancing human cognition rather than trying to mimic or replace it. A good example of the use of Augmented Intelligence is in the Automated Readiness Forecasting (ARF) tool designed by the AF CyberWorx team.

ARF accepts mission parameters, such as an exercise several months away, and schedules qualification events to ensure that appropriate personnel and resources are available for the exercise. It also keeps track of all necessary equipment maintenance as well as personnel training and medical activities needed prior to the exercise. It suggests schedule changes which will ensure the required resources are ready in time. Affected personnel only have to approve or acknowledge any adjustments. The tool then tracks progress, highlighting any deviations to the plan and suggesting corrections along the way to stay on track. Machine Learning uses the repeated iterations and changes to evolve the scheduling algorithm, fine-tuning its capabilities and reducing reliance on human intervention.
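
The pattern is easy to see in miniature. The sketch below is purely illustrative Python with invented field names, dates, and thresholds; it is not the ARF implementation, but it shows the human-in-the-loop idea: the program does the bookkeeping and proposes changes, and a person only approves or rejects them.

from datetime import date, timedelta

EXERCISE_DATE = date(2020, 9, 1)    # hypothetical mission parameter
MARGIN = timedelta(days=14)         # hypothetical buffer required beyond the exercise

personnel = [  # hypothetical readiness records
    {"name": "Pilot A", "qual_expires": date(2020, 10, 15)},
    {"name": "Pilot B", "qual_expires": date(2020, 8, 10)},   # lapses before the exercise
    {"name": "Pilot C", "qual_expires": date(2020, 8, 30)},   # too close for comfort
]

def suggest_requals(records):
    """Flag anyone whose qualification lapses before the exercise (plus margin)
    and propose a re-qualification deadline; humans approve or reject each one."""
    suggestions = []
    for person in records:
        if person["qual_expires"] < EXERCISE_DATE + MARGIN:
            suggestions.append((person["name"], EXERCISE_DATE - MARGIN))
    return suggestions

for name, deadline in suggest_requals(personnel):
    print(f"Suggest scheduling {name} for re-qualification by {deadline}")

# A machine learning component could tune MARGIN and the proposal rules over time
# based on which suggestions humans accept or reject.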

The readiness tool assumes the mundane tasks humans would otherwise have to perform to keep track of readiness data and immediately adjusts the plan to accommodate any changes to resource availability. Normally, such adjustments would require dozens of people working for hours to accomplish. Humans are still required to approve changes but are relieved of the time-consuming and error-prone tasks associated with adjusting schedules manually. This tracking and automation saves hundreds of resource hours, eliminates errors, and ensures a higher level of readiness.

This example illustrates an Augmented Intelligence and Machine Learning paradigm that could be applied to many Air Force projects rather than a typical Artificial Intelligence approach. The point is, AI is not a panacea for all problems. AI is successful when the problem domain is well understood, the data set is accurate, and the AI is focused on a specific task. Knowing when to design for Augmented Intelligence versus Artificial Intelligence – and the difference between the two – is crucial to success. Not every problem needs an AI solution.

*The postings on this blog reflect individual team member opinions and do not necessarily reflect official Air Force positions, strategies, or opinions.

JANUARY 2019 NEWSLETTER – #AFCTM SPRINT DELAYED, YET SUCCESSFUL

COLLABORATION

Last week, we had a slow start due to the snowstorm on Tuesday, but AF CyberWorx was able to recover and hold a successful #AFCTM Sprint. Six teams of six worked together to solve challenges related to the Air Force's Cyber Talent Management.

We’ll have a press release and report prepared in a few weeks on the solutions that were proposed, but in the meantime, check out the photos taken during #AFCTM on our Flickr page. 

View Now!

VISIT US AT THE ROCKY MOUNTAIN CYBERSPACE SYMPOSIUM 

INNOVATION

From February 4-7, 2019, we'll be at the Broadmoor in Colorado Springs for RMCS in booth 65. RMCS provides a national forum for industry and government to work together to help solve the challenges of cybersecurity, community cyber readiness, and national defense. Join us in the conversation to learn how you can work with us.

Join Us!

JOIN US AT COLLIDER ON CATALYST CAMPUS

NETWORKING

Connect with our government and industry personnel at Catalyst Campus for a networking happy hour event on an RMCS evening. Collider is an open house where guests can gain insight on AF CyberWorx projects of the past and future. We can’t wait to see you on Wednesday, February 6, 5-7 pm at Catalyst Campus Co-Lab Kitchen & Railyard.

RSVP Now!

THE AIR FORCE CHALLENGE

Join AF CyberWorx and the Air Force Research Laboratory in the Vice Chief's Challenge.

The Vice Chief's Challenge is an open competition to solicit innovative ideas to tackle Air Force-level problems. This year's challenge will take on Multi-Domain Operations (MDO). Submissions must be entered into the Air Force IdeaScale before February 28, 2019.

Join Us!

PART FIVE: AFTER THE SPRINT: NEXT STEPS

Dissecting the Design Sprint Event

A design sprint’s ultimate goal is to improve a situation, whether that means improving an existing process or developing a new product. Not acting on the solutions a team suggests means a design event has not fulfilled its purpose. AF CyberWorx continues working beyond the sprint itself to help the results of a design sprint reach implementation.

The last step of the event is the outbrief, where participants present their findings to the stakeholders. With team suggestions in mind, AF CyberWorx uses its own crack team to work with stakeholders to determine the next steps.

Greg Bennett, AF CyberWorx Relationship Manager, explains, “Every single sprint is going to be different. The outcomes are going to be different. The resources required on the back end are going to be different.” As such, AF CyberWorx customizes a roadmap to the end result according to the needs, limitations, and available resources of the problem owner. “We will continue to stay engaged as they want us to stay engaged,” Greg reassures. However, “We will not fight the fight for them.” Even with CyberWorx assistance, “Transition will not happen without [the champion’s] direct involvement and advocacy.”

Each champion and problem owner has different levels of capability and needs. As such, AF CyberWorx has different tools at their disposal to help. While some projects proceed best after transitioning to a different Air Force agency (which is an option), others are best served by going through a research and contracting process to get industry assistance. AF CyberWorx acts as a bridge between the problem owner and the contracting office to assist with that.

Once the problem owner and AF CyberWorx decide to pursue contracting options, Erica Wilson, Contracting Officer, and Casey Pehrson, Contracting Specialist, go to work. Erica explains that with AF CyberWorx acting as their technical point of contact, “we connect market research to figure out if anybody’s doing what they’re wanting done, who’s doing it, and what types of businesses are doing it. From there, we decide on a contracting vehicle.” The contracting office looks for the most streamlined avenue based on available information, resources, and the problem owner’s timeline. To best do that, as Casey says, “We just need to come together as a government team and find the best way forward.”

The problem owner can help that process most by clearly defining their wants and needs: what they need, what they’re trying to buy or do, and what their limitations are. Information from the design sprint really helps with this, but the more specific the project parameters, the better the contracting office can find the best paths forward.

“Transition [to the end result] requires continued involvement with the customer,” Greg stresses, “to formulate strategies…and advocate with leadership, program offices, and sustainment functions.” It takes a dedicated team to champion a solution and transition it from the idea stage to implementation. From the discovery call laying the groundwork through the problem solving process to drawing out a roadmap to implementation, AF CyberWorx helps guide the process by connecting the right people at the right time. While we don’t do the fighting, we coach problem owners as needed to refine and mitigate Air Force challenges.

*The postings on this blog reflect individual team member opinions and do not necessarily reflect official Air Force positions, strategies, or opinions.