Data analytics can be a tough topic to teach. There's no real road map for instructors to follow, and the technology that underlies it can change at a rapid clip. What's more, students often have difficulty grasping the concept of critical thinking — the crux of data analytics — or understanding why critical thinking is even important.
To get past these hurdles, some accounting faculty are using teaching models to guide their students through the process of data analytics. These models comprise steps that students can follow, and they provide faculty with a more structured way of teaching data analytics in their classrooms. Here are two models that faculty have used successfully:
The SPARKS model
Ann Dzuranin, CPA, CGMA, Ph.D., KPMG Endowed Professor of Accountancy at Northern Illinois University, and her co-authors Margarita Lenk, Ph.D., associate professor in the departments of accounting and information systems at Colorado State University in Fort Collins, and Guido Geerts, Ph.D., professor of accounting at the University of Delaware in Newark, have created a model called "SPARKS" to help accounting educators teach data analytics to students.
SPARKS is an acronym for six "critical-thinking steps," Dzuranin said, as outlined below:
- S is for stakeholders, or the people who will be impacted by the analysis. Understanding the potential impacts guides our analysis and communication choices, Dzuranin explained.
- P is for the purpose of the analysis. Here, students must develop the main question that needs to be answered. If a business is losing money, a basic question would be, "Why are you losing money?" Dzuranin said. But accounting students need to think more deeply. They might ask, "In what areas is this business losing money and why?" and then perform an analysis that would help them answer that question, she added. Without such a guiding question, students can get lost in the data.
- A is for alternatives. Since data analytics seldom generates black-and-white answers, students must keep their minds open to different — and often better — ways of doing things. That could mean accepting an alternative result presented by a fellow student or group or doing a second analysis.
- R is for risk. Students need to be aware of risks that could lead them to make the wrong conclusions. These risks can include stakeholder biases, their own biases, flaws in the data, choosing the wrong type of analysis, or making false assumptions or bad decisions.
- K is for knowledge. Students should know what information they need to perform the analysis but also what additional knowledge they lack and how to obtain it, either through research or other methods.
- S is for self-reflection. To complete the SPARKS process, students must take a step back and reflect on what they learned, what worked best, and how they might use that knowledge in a future situation.
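To make the "purpose" step concrete, the guiding question from the example above, "In what areas is this business losing money, and why?", can be answered with a very small analysis. The sketch below is illustrative only; the business areas and figures are hypothetical, not from the article.

```python
# Illustrative only: hypothetical profit-and-loss records by business area.
records = [
    {"area": "Retail", "revenue": 120_000, "costs": 150_000},
    {"area": "Online", "revenue": 200_000, "costs": 140_000},
    {"area": "Wholesale", "revenue": 80_000, "costs": 95_000},
]

# Purpose step: answer "In what areas is this business losing money, and by how much?"
losses = {
    r["area"]: r["revenue"] - r["costs"]   # net result; negative means a loss
    for r in records
    if r["revenue"] < r["costs"]
}

# Report the loss-making areas, worst first.
for area, net in sorted(losses.items(), key=lambda kv: kv[1]):
    print(f"{area} is losing ${-net:,}")
```

Starting from a focused question like this, rather than from the raw data, is exactly what keeps students from "getting lost in the data."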
This framework, Dzuranin said, helps students think through each stage, develop better questions, interpret the data, stay focused, and relay that information to others in layman's terms. The steps don't necessarily need to be taught in order, she said, but students should go through them all, one by one, as each is an important component of the data analytics process.
She advised accounting faculty to use the SPARKS model often so these critical-thinking steps will become second nature to students. "For anything you are teaching, you can ask students to think about who the stakeholders are, what is the purpose, what is the knowledge required," she said. "We can help students by reinforcing the concept throughout the semester."
Kimberly Swanson Church, Ph.D., director of the School of Accountancy at Missouri State University in Springfield, Mo., uses the SPARKS model in her classes and likes how it "provides a framework to guide discussions and helps walk students through the process when applied to cases used in class." She introduces the model at the beginning of her course, and then she and the students build projects around it. She first breaks assignments down into steps and then later lets students work through a case on their own using what they have learned. "Students appreciate the structure of the steps," she said.
Dzuranin and her colleagues have written a soon-to-be-published textbook that will include the SPARKS model. She advised other accounting faculty interested in SPARKS to contact her or her co-authors, or check out resources on critical thinking from the Foundation for Critical Thinking, or from the AICPA.
The design thinking model
The design thinking model is a formalized process for developing designs for products and services. It has been applied in many different fields, but it can also be used as a step-by-step outline for teaching and learning about myriad subjects, including data analytics. Cathy Scott, Ph.D., associate professor of business accounting at the University of North Texas at Dallas, and her research partner Markus Ahrens, CPA, CGMA, professor of accounting at St. Louis Community College in St. Louis, Mo., currently use the model throughout their data analytics course.
Scott said the model is a good fit for data analytics "because it helps students through what I call the 'retraining the brain' process." It encourages them to "think more creatively and more innovatively, and not fear failing," which can often be a problem for beginners, she said.
It is also a "more rigorous approach to problem-solving than the traditional problem-solving methods," she noted, and it forces students to think about multiple possible solutions that will meet clients' needs — instead of the traditional "one right answer" in education.
The design thinking model incorporates five steps. For faculty members using the entire design thinking process, these steps should be completed in the order shown below, Scott noted. However, instructors can "use any of these steps with their current teaching models," she said. The steps are:
- Empathize. People using the model (in this case, students) determine who will be the end user or client of their analysis, and they identify with that end user before moving forward. This is also often called the discovery stage, where users of the model start researching and understanding the challenge.
- Define (also called the interpretation phase). Students determine what problem they are trying to solve and determine their "problem statement."
- Ideate. Students generate and refine ideas and learn that having a large pool of ideas may help them determine the best possible outcome.
- Prototype (also called the experimentation phase). Students create prototypes for testing (in product-related cases) and see the benefits of what they are proposing. The prototype phase in a data analytics course "involves preparing the analysis for feedback and reflection," Scott said.
- Test (also called the evolution phase). Students implement the feedback they received about their prototypes and reprocess their data and analysis if necessary before "communicating their final results and/or proposed solutions," Scott said.
Scott introduces the design thinking model at the beginning of her accounting data analytics course and then applies "pieces of the process to activities until students have worked through the entire process," she said. "Each time a new step is added for an activity, students are applying the previous steps first, then adding the new step." On the final project, students apply the entire process.
In her course, all activities involve end users. In one activity, for instance, she gives students a dataset with some criteria. She then asks them to present their final results to the fictional CEO and management team of an organization. Next, they present these results to a client. "Students quickly realize that the visualizations and approach for these different end users often need to be modified," Scott said.
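The idea that the same analysis "often needs to be modified" for different end users can be sketched in code. The example below is a hypothetical illustration (the result data and the two presentation functions are my own, not from Scott's course): one analysis result rendered as a single headline figure for an executive and as a line-by-line breakdown for a client.

```python
# Illustrative only: one hypothetical analysis result, presented two ways.
result = {
    "question": "Where are we losing money?",
    "findings": {"Retail": -30_000, "Wholesale": -15_000},  # negative = loss
}

def summary_for_ceo(result):
    """One headline figure: executives often want the bottom line first."""
    total = sum(result["findings"].values())
    return f"Total losses across {len(result['findings'])} areas: ${-total:,}"

def detail_for_client(result):
    """A line per area: a client acting on the results may need the breakdown."""
    lines = [result["question"]]
    for area, net in result["findings"].items():
        lines.append(f"  {area}: ${-net:,} loss")
    return "\n".join(lines)

print(summary_for_ceo(result))
print(detail_for_client(result))
```

The underlying findings never change; only the framing does, which is the "empathize" step applied to communication.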
During the design thinking process, students must "empathize with each end user and consider their needs," Scott said, and ask such questions as "What concerns do the end users have?" and "What problems are they trying to solve?"
In the prototype stage, Scott occasionally invites outsiders — industry experts, alumni, or faculty from other fields — to address the classroom and provide feedback for students. These outside sources help students determine whether others "understand the analysis" they came up with and agree that they answered key questions "in a meaningful way," she explained. The sources also provide recommendations that help students to "consider alternatives, refine or improve their analysis, and better communicate their end results," she said.
During the test phase, students evaluate the feedback they received. Sometimes, she said, they learn that they require additional data that is currently unavailable. "In those cases, students discuss what data they would need to collect (or request) in the future, what additional questions they could ask from the new data, and what the new results might do to benefit the end user," Scott said.
Scott recommends that other faculty members interested in design thinking use David Lee's book, Design Thinking in the Classroom, as a resource because it was written specifically for educators.
— Cheryl Meyer is a freelance writer based in Minnesota. To comment on this article or to suggest an idea for another article, contact Courtney Vien, a JofA senior editor, at Courtney.Vien@aicpa-cima.com.