Case Studies

Proficiencies:

Elucidat, Snagit, Rise, Articulate, Captivate (formerly certified), Google Suite, MS Office, Confluence

While I have completed many instructional design-related projects since my first board and card game attempts back in 2013, I have done so in a professional capacity since November 2020. 

I am keen on scalability and maintenance. Where possible, I lean on creative and multimedia elements for my learning experiences. 

Case Study: Call-to-Action Tutorial System in Cosmic Shore

At Froglet Games, I created a Call-to-Action tutorial system that uses regular rewards, contextual presentation, and no text or speech to guide players through hidden tutorials of nearly every mechanic the game has to offer. 

The measures of success have not yet been set in stone, but key metrics include:

  • How many tutorials players complete within the first few hours of play
  • Whether players continue doing tutorials after completing the first 2 required ones
  • How much in-game time passes and how many logins occur before a learning plan is complete
  • Once the knowledge base holds written versions of these tutorials, how full views of those articles compare to the number of completed learning paths covering the same topics

Instead of text-based, scenario, in-universe, or codex tutorials, I created my own Call-to-Action tutorial system.

Project Requirements and Context

I tasked myself with building tutorials for our game, Cosmic Shore. After discussions with our CEO and gameplay designers, I concluded that the tutorials had to be easily localized, unobtrusive, and experiences that most players would complete without feeling forced into them. They also had to be scalable in size, easily adjustable as the user interface changes, and able to serve specific tutorials only to the right learners.

Implementation

I worked with our gameplay developer and backend engineer to create the Unity assets and scripts that connect a trigger on a UI icon, advancing the tutorial and giving the player a small but clearly advertised reward. Usability testing confirmed the system worked as intended, could be applied to any user interface icon or player-input action, and could scale rewards appropriately to fit within the in-game economy.
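The production system lives in Unity C# scripts, but the core trigger flow is simple to illustrate. Here is a rough, engine-agnostic Python sketch of that flow; every name and value below is hypothetical, not the actual implementation:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class TutorialStep:
    ui_icon_id: str   # the UI icon that carries the Call-to-Action highlight
    reward: int       # small but clearly advertised in-game reward

@dataclass
class LearningPath:
    steps: list[TutorialStep]
    grant_reward: Callable[[int], None]   # hook into the in-game economy
    current: int = 0

    def active_icon(self) -> str | None:
        """Icon that should currently show the Call-to-Action treatment."""
        if self.current < len(self.steps):
            return self.steps[self.current].ui_icon_id
        return None

    def on_ui_interaction(self, icon_id: str) -> None:
        """Advance the path when the player triggers the highlighted icon."""
        if icon_id == self.active_icon():
            self.grant_reward(self.steps[self.current].reward)
            self.current += 1   # the Call-to-Action moves to the next step
```

Because each step only references a UI icon identifier and a reward amount, the same structure can be attached to any icon or action and rebalanced against the economy without touching the trigger logic.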

I then built the tentative wireframe for the Call-to-Action system based on the learning experiences I wanted most players to have. This wireframe has undergone enormous iteration. It started as 3 rows, once grew to 12, and currently rests at 8 learning paths. 

To signal a Call-to-Action trigger, I consistently used green, gradients, pulsating animations, and creased corners so players would build familiarity with the triggers.

Challenges, Findings, and Outcomes

Looking back, I am very happy I chose this approach to tutorials in Cosmic Shore. Knowing that vastly more complex features are on the way after launch, I at least have the simple, nearly intuitive bases covered with these learning plans. I am proud that they are enormously accessible, scalable, and recognizable. It is a lot of work to wire up a learning plan in the application instead of just having a textual pop-up tutorial trigger once per player, but from a testing, localization, and quality perspective, I feel the Call-to-Action system is a very compelling approach.

Usability testing the Call-to-Action system showed me many things:

  • People really enjoy receiving rewards for doing things. Even if a reward is very small, so long as it is tangible and immediately usable in-game, players will accept relatively large inconveniences in pursuit of it.
  • If the UI undergoes significant change, the system has to be updated and maintained. If a link in the chain breaks, it can stall the triggers for the remaining learning paths, which is a major problem. I can partially mitigate this with a fallback I designed that counts game log-ins and certain player actions, automatically advancing past tutorials a player has clearly already absorbed (see the sketch after this list).
  • Establishing strong, unique, and clear iconography, along with all its possible variants, is critical for the system to work. I believe animation styles that look distinct from other menu animations help players understand where to click next.
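Continuing the hypothetical sketch from earlier, the fallback might look like the following; the thresholds here are invented for illustration, not the game's actual values:

```python
# Continuing the LearningPath sketch above; thresholds are hypothetical.
LOGIN_SKIP_THRESHOLD = 10   # logins before a basic step is assumed learned
ACTION_SKIP_THRESHOLD = 3   # unprompted uses of the step's own icon

def auto_advance(path: LearningPath, login_count: int,
                 action_counts: dict[str, int]) -> None:
    """Skip steps the player has clearly already learned on their own,
    so one broken trigger cannot block the rest of the learning path."""
    while path.current < len(path.steps):
        step = path.steps[path.current]
        used = action_counts.get(step.ui_icon_id, 0)
        if login_count >= LOGIN_SKIP_THRESHOLD or used >= ACTION_SKIP_THRESHOLD:
            path.current += 1   # counted as complete; no reward granted
        else:
            break
```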

While the system is not fully implemented, I know enough about planning, building, and updating the Call-to-Action system to draw educated conclusions about it. Building a matrix in Unity that updates and flags broken links when UI updates are made will be the most important next step after building the system itself, as sketched below.
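As a rough sketch of what that validation pass might look like, again continuing the hypothetical illustration rather than the actual Unity implementation:

```python
# Hypothetical sketch of the planned validation pass: after a UI overhaul,
# flag every learning-path step whose trigger icon no longer exists.
def find_broken_links(paths: dict[str, LearningPath],
                      live_icon_ids: set[str]) -> list[tuple[str, int]]:
    """Return (path name, step index) for each step with a dead trigger."""
    return [(name, i)
            for name, path in paths.items()
            for i, step in enumerate(path.steps)
            if step.ui_icon_id not in live_icon_ids]
```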

Our user interface has been undergoing a complete overhaul for a number of months, so I am waiting until that is complete to implement new learning paths.

Case Study: nCino Release E-learning Review and Update System

At nCino, I was tasked with solving how to update our Certification Product Release e-learning as efficiently as possible. I created 3 complex spreadsheets that solved this problem very well, holding up even after heavy review by senior IT, Product, and Technical Writing managers. Due to nCino’s privacy rules, I cannot show the sheets I created, but I can describe them in detail with graphical representations of their capabilities.

After reviewing the needs and limitations of the situation, I determined the following things:

  • Our team likely had over 200 significant updates to make to over 30 e-learning modules
  • We needed SMEs for demos, images, and review
  • Work had to be delegated to the correct Instructional Designer who owned the relevant e-learning
  • Every module's stakeholder had to be pinged about prospective updates and update completion
  • It would be very helpful to automatically share relevant modules and resources with the tasked instructional designer
  • Every product e-learning module had to be cataloged and given content tags in the spreadsheets
  • The Salesforce-based release notes had to be scoured for keywords with which to trip an e-learning release update ping
  • I could only use existing nCino tools and software licenses to make the system; no additional costs would be accepted

After 5 months of being assigned projects with higher priority than updating existing product e-learning, my director and manager tasked me with figuring out a system for updating our e-learning using product release notes.

Given the unknowns and the lack of success other employees had with solving this problem, my seniors primarily just wanted me to figure out how to quickly and easily update our product e-learning. After extensive discussion, I determined these more precise measures of success:

  • Representatives from all reviewing or contributing parties must consider it an effective and elegant system
  • All product e-learning can have an update need recognized and have updates made relatively easily
  • Updates can be completed within 2 weeks of the monthly release notes being publicly available
  • The 5 months of missed updates could be completed using this tool
  • Few to no e-learning modules would have updates missed due to the spreadsheets not recognizing monthly release updates correctly

Project Requirements and Context

The Product Enablement sector at nCino was a very complex, challenging, and ever-changing area. For one half of the year, the focus would be maintaining our 9 certification learning curricula and paid exams; at other times, it was just-in-time training for new nCino customers. Once the company moved to a monthly software release update cycle, updating existing e-learning became even more difficult.

Implementation

I first attempted to build solutions using Microsoft Forms, LucidChart/Spark, Confluence, Jira, Microsoft Planner, and a few other programs. Nothing worked as well as I wanted. I found the earliest success with SmartSheet, as nCino’s license allowed incredibly useful connections and integrations between SmartSheet and our other software. I decided to stick with SmartSheet after that initial testing.

After around 2 weeks of testing and iterating on SmartSheet options, I wireframed a solution involving 3 interconnected SmartSheets. Each one played a unique but critical role in the total system, detailed here:

  • E-learning Content Database: This first spreadsheet lists every Elucidat and Articulate Rise-based product e-learning module ever made at nCino. Each module, represented by a row, is given tags related to certification exams, certain keywords, and association with specific products and solutions, among many other things. Each month, I would scan the release note document for database keywords. When a keyword is applied to this spreadsheet, it highlights each cell containing a module with that keyword, and any highlighted row is automatically copied into spreadsheet 2 (the matching logic is sketched after this list).
  • Working Document: This second spreadsheet is fed all the highlighted rows from the first. I assigned each row to a specific Instructional Designer, depending on their schedule and their ownership of specific solutions or products. When an Instructional Designer is assigned a row, they are emailed all relevant resources for updating the e-learning. If the Instructional Designer decides the module needs updating, they click a button to ping product stakeholders and SMEs; otherwise, they move the row to a lower part of the sheet to defer it for that month. Once the month is complete, I send these rows to the third spreadsheet and prepare this document for the next month’s updates.
  • Historical Updates: This third and final spreadsheet holds every update made or declined for each monthly release cycle, including the reason behind any declined update. Each sheet page represents one month of updates. A dedicated page allows powerful, precise filtering of the work across any number of modules and any range of months.
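Since the sheets themselves are confidential, here is a minimal Python sketch of that first hand-off as described above; the field names and matching rule are simplified illustrations, not the actual SmartSheet formulas:

```python
from dataclasses import dataclass

# Minimal sketch of the database-to-Working-Document hand-off.
# Field names are hypothetical; the real system was three SmartSheets.

@dataclass
class ModuleRow:
    title: str
    tags: set[str]   # certification exams, products, solutions, keywords

def flag_modules(release_note: str,
                 database: list[ModuleRow]) -> list[ModuleRow]:
    """Return the rows whose tags appear in this month's release notes."""
    text = release_note.lower()
    return [row for row in database
            if any(tag.lower() in text for tag in row.tags)]
```

The flagged rows are what land in the Working Document, where each is assigned to the Instructional Designer who owns that product area.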

I worked almost entirely independently during this project. Collaboration occurred only during the later major review cycles. I was still working on new product e-learning, QA, and process guide projects during this time. I estimate I spent around 8-12 hours each week on this review system.

After building and iterating on the system over a 2-month period, and spending a few weeks reviewing it with resident experts, I presented the final product to a large group of instructional designers, managers, IT specialists, and product stakeholders. They were all impressed and pleased by the system and had no reservations about the final product.

Challenges, Experiences, and Findings

I am really proud of this review system. I had never worked on a bespoke system of this magnitude or type, but I think it works really effectively. In its final stages, every person reviewing it, including senior individuals I consider far more intelligent and capable than me, thought it was essentially perfect and found no notable flaws or ways to improve it given the constraints I had.

Here are the challenges, experiences, and findings I have for each spreadsheet:

  • E-learning Content Database: I may have spent almost half the project placing modules and metadata in this database. It was brutally mundane work, but having well-organized data to reference while working with a finished system made a world of difference. Had the review system been used, I think this would have paid for itself in work hours after just a few update cycles.
  • Working Document: This second spreadsheet took quite a bit of iteration. I ended up making each month's copy disposable: once we updated modules for a month and moved the sheet's data into the Historical Updates sheet, it would not be used again. This was not intuitive to me, and only emerged as a good option towards the end of the review system build.
  • Historical Updates: The filtering page in the Historical Updates spreadsheet proved one of the most useful tools in the review system. It helped us defend our work, offer detailed metrics, and remind ourselves how often in a year we updated our modules, among many other benefits. I received the most praise on this particular part of the system from nearly everyone who reviewed the sheets. I spent many hours updating the filters to offer more functionality, which in turn required updating the database with more metadata and tags, a very substantial initial lift.

I tested this tool alongside our Principal Instructional Designer on one month’s updates and it worked exactly as intended. She was very satisfied with how much easier it made everything surrounding the actual module updates. In comparison to how we used to update release modules, we felt that it almost doubled our updating speed.

Unfortunately, due to tumultuous product updates, organizational changes, and deprioritization of our product certification e-learning, this review system was never put into real action. It remains in its fully functional state to this day. Regrettably, I never had an opportunity to see or contribute to a real monthly release update cycle with it. In lieu of using this system, our managers chose for us to update product e-learning only if directly requested by a stakeholder, customer, or SME.

No matter what, an instructional designer would still need to manually update our certification e-learning. Even with each module's keywords catalogued in the sheets, there would almost always be multiple modules that had to be investigated for update opportunities. Updating product e-learning is simply an enormous lift.