For this discussion, imagine that you are an instructor teaching an introductory course to a group of incoming first-year students. It is your first day, and you want to provide an introduction to what they will study in your course. Review the Week 1: Defining the Discipline of the Course reading list and then answer the following questions in your initial post:
- How would you describe the field of study?
- How would you differentiate it from other related fields?
- What methods would you employ to help students with limited experience understand this field of study at a higher level?
Turkish Online Journal of Distance Education (TOJDE), October 2014, ISSN 1302-6488, Volume 15, Number 4, Note for Editor
EAGER ADOPTERS IN EDUCATION:
Strategic Plan Ideas for Integrating Instructional Technology
Jace HARGIS, PhD
San Diego, CA, USA
ABSTRACT
Strategic plans for teaching and learning are essential; however, they tend to focus on
moving a mass of stakeholders along an agreeable path. That strategy is necessary and
sensible, yet these plans often miss a key audience for the future of education: the eager
adopters. Previously, this group was called “early adopters”; however, I believe that the
time at which educators become involved matters less than their eagerness. This
philosophy follows Thoreau’s notion that, “If a man loses pace with his companions,
perhaps it is because he hears a different drummer.” It is those who hear different
drummers who may appear tangential to institutional missions, although they may
actually be leading initiatives that the institution eventually adopts. Eager adopter ideas
shared in this paper include Social Emotional Competency, Digital Content Creation
Ecosystems, MOOCs, play-with-purpose maker-economy fabrication labs, the Scholarship
of Teaching and Learning, big-data learning analytics, wearable technology, the
quantifiable self, the Internet of Things, and mobile learning. The paper describes these
eager adopter ideas aligned to the 2014 NMC Horizon Report; eager adopter
philosophies; and eager adopter questions to help initiate and guide strategic planning
discussions.
Keywords: Eager adopters, technology, mobile learning, strategic plans, instructional
technology.
INTRODUCTION
A model frequently referenced by eager adopters is the Elements of the Creative
Classroom Research Model from the 2014 New Media Consortium (NMC)/Educause
Horizon Report (http://cdn.nmc.org/media/2014-nmc-horizon-report-he-EN-SC)
(Johnson et al., 2014). This model provides a broad organization of innovative
pedagogical practices. Eager adopters are able to independently consider ideas from this
model, which they typically have observed or could envision using in their classrooms. An
ideal method for addressing many, if not all, of these practices would be to bring together
a cohort of eager adopters as a think tank to share their ideas and subsequently create a
map of the practices they currently use, plan or hope to use, and would like to use but
perhaps cannot due to limited resources.
I will use this model as an advance organizer for this paper, identifying particular aspects
of the model that align well with eager adopter philosophy and describing potential
eager adopter ideas for strategic plan consideration.
Content and Curriculum (Emotional Intelligence,
Cross/trans-disciplinary, Open Educational Resources, Meaningful Activities)
In this category, an eager adopter would be interested in building a Digital Content
Creation Ecosystem (DCCE), which allows faculty to upload media-rich instructional
strategies into an open-source platform. The concept-based, active instructional plans
would then be crowdsource-assessed, edited, improved, and returned to the open system
to be continually updated and improved. In addition, best-practice instructional design
suggestions based on foundational learning theories would be provided online to help
faculty create electronic learning objects (eLOs) as Open Educational Resources (OER).
The open environment would also include copyright-free graphics, videos, simulations,
and apps. As education uses more digital tools and information, an efficient,
collaborative DCCE platform will become essential.
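The DCCE workflow described above (upload, crowd-assess, revise, return to the open pool) can be sketched as a minimal data model. This is a hypothetical illustration; the class and field names are assumptions for the sketch, not part of any system described in the paper.

```python
from dataclasses import dataclass, field

@dataclass
class LearningObject:
    """Hypothetical DCCE record: an openly licensed, crowd-improved eLO."""
    title: str
    author: str
    license: str = "CC-BY"                          # OER-friendly default
    media_urls: list = field(default_factory=list)  # rich-media attachments
    revisions: list = field(default_factory=list)   # crowd-sourced edit history

    def submit_revision(self, editor: str, note: str) -> None:
        # Each accepted improvement is appended, keeping the object open
        # for continual updating by the community.
        self.revisions.append({"editor": editor, "note": note})

lo = LearningObject("Intro to Fractions", "faculty_author")
lo.submit_revision("peer_reviewer", "Added a worked example with visuals")
print(len(lo.revisions))  # 1
```

The append-only revision list mirrors the paper's idea that objects are never "finished" but continually improved by the community.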
To address the Emotional Intelligence attribute of this category, an eager adopter would
search for an institutional Center for Social Emotional Competency (SEC) to collaborate
with and to use as a vehicle to promote their work beyond, and as a distinction to, their
disciplines. Typically, SEC is thought to have at least five distinct attributes:
self-awareness, self-regulation, intrinsic motivation, empathy, and social skills. A proper
Center would include a focus on embracing failed events and instilling grit as enduring
characteristics. Emotional Intelligence has been popular in engineering circles since
Goleman’s (1996) book on the topic. The thought was that incorporating social skills
training for working engineers might assist their conversations with clients on analytical
topics. The “intelligence” has since been researched further, and many believe Emotional
Intelligence is a competency, hence the updated term Social Emotional Competency,
which can be taught.
Assessment (Engaging formats, Formative, Informal learning)
An eager adopter would generate an electronic portfolio process, leveraging technology
to represent learner abilities in ways that would previously have been difficult or
impossible. E-portfolios have been idealized for many years; however, a complete,
functional package has not yet been developed.
Several universities have attempted this; however, their efforts have been abandoned.
Textbook and software companies have attempted it as well, but their lack of experience
and high costs have prohibited adoption. Learning Management Systems have taken a
similar path, originating in post-secondary education and now common in secondary
schools. This provides an easier transition from secondary to post-secondary education
for learners, which is especially helpful during a stressful time when many parts of their
education are new. For revenue, such a system could be copyrighted and marketed to
secondary and post-secondary institutions.
An idea that capitalizes on the power of both formative assessment and informal learning
for eager adopters would be to make mobile tablets with screencasting apps readily
available. Soto and Hargis (2014) found that by asking students to create screencasts on
an Apple iPad mobile tablet on how they solved mathematical problems, the researchers
could capture the learner voice and actions in an efficient, easily managed audio file.
Upon further examination, the researchers were able to more deeply understand how
students were processing and making sense of math.
The screencast app made it possible to view when and what was erased, along with
students’ verbalizations as to why, which enabled the researchers to correct math
misconceptions that otherwise would have gone unnoticed. Review of the screencasts
was completed soon after production and therefore, as a formative assessment, served
as a key piece of information for real-time remediation.
Learning Practices
(Exploring, Creating, Playing, Self-regulated, Personalized, Peer-to-peer)
Although Massive Open Online Courses (MOOCs) for large-scale dissemination of our
teaching and learning may not be new to the educational scene, building effective MOOCs
remains a mystery. Most MOOCs now available have simply transferred low-level, didactic
lectures into an electronic format and placed them on easy-to-access online platforms.
An eager adopter would bring similar innovative, active teaching methods that have been
successful offline into the MOOC world and find creative ways to further engage the
learner. For example, students could be presented with a menu of videos from which to
select, each presenting a different real-life problem. After viewing, they would be asked
to create an organization to address the issue, share it with their colleagues, and crowd-
source a consistent, agreed-upon procedure. Guidance from the instructor could suggest
that the final student representation of learning outcomes include a media-rich artifact
drawn from a menu of programs, which could include vodcasts, virtual worlds, social
networks, and green-screen video technologies, and perhaps even find ways to connect
to wearable technology, the quantifiable self, or the Internet of Things. This approach
would empower learners to use the skills shown to help them engage with, retain, and
use conceptual frameworks: exploring, creating, and playing.
Eager adopters are eager mostly because they enjoy what they do, and they enjoy what
they do because many of them view teaching and learning as a form of play. Therefore,
eager adopters would thrive at an institution that builds a highly interactive,
manipulative-based “Play with Purpose” facility combining the benefits of a gamification
zone and a Fabrication Laboratory (Fab Lab). The space would provide equipment and
materials for users to create physically engineered prototypes and/or technology
products, such as mobile apps or even educational app packages. The major theoretical
construct typically used in this context is the Information Processing Model, in which the
instructor/designer intentionally builds in encoding activities to connect working
memory to long-term memory.
When this is done properly, the learner can decode the information in meaningful ways.
Cognitive science research by Mauro (2011) indicates why and how we can incorporate
gaming into learning. Key attributes of a successful learning game include a simple yet
engaging interactive concept; cleverly managed response time; short-term memory
management; mystery (i.e., inquiry); and measuring that which some say cannot be
measured. Using learning games is a highly effective way to build personalized,
self-regulated abilities.
Teaching Practices
(Soft skills, Individual Strengths, Multiple learning styles, Modes of thinking)
Eager adopters are also eager to share their efforts with a broader community of
educators. One of the best methods to achieve this is to use the current scholarship
model, although with a different audience.
Instead of discipline-specific research, a useful framework for the technology-educator-
scholar is the Scholarship of Teaching and Learning (SoTL).
A common SoTL model is Ernest Boyer’s (1990) model. Boyer proposed a broader
definition of traditional scholarship in the academy. His model incorporated four
distinctly different types of scholarship: the traditional scholarship of discovery; the
scholarship of integration, synthesizing information across disciplines; the scholarship of
application, or engagement; and SoTL, a systematic study of the teaching process using
accepted social science study designs.
SoTL empowers teachers to systematically collect data on the effectiveness of how they
integrate instructional technologies, as well as provides a credible, evidence-based
method for sharing the results with their colleagues in a language accepted by the
academe.
In addition to exploring how technology works in their class and discipline, SoTL also
provides a springboard to explore soft skills, individual strengths, multiple learning
styles, and alternate modes of thinking.
An ideal way to accelerate faculty engagement in SoTL activities is the “Learning
Laboratory,” where faculty experiment with teaching strategies that integrate
appropriate, functional, meaningful instructional technology. Typically, these labs are
classrooms retrofitted with exploratory technology, as well as staff who can assist with
new, innovative teaching strategies. Equipment could include large multi-touch screens,
mobile devices, remote projection, green-screen technology, high-end video-conferencing
and recording, and multiple ways for learners to present and capture their learning
experience.
A common extension is the development of a Learning Lab Faculty Fellows program,
where faculty are certified, similar to the Apple Distinguished Educator program, and
then provide professional development for the university community.
Organization (Monitoring quality, Innovative timetables, Innovative services)
Monitoring quality and providing innovative services could be achieved by creating
efficient ways to gather large amounts of data and make them available to eager
adopters to aid real-time decision-making. Eager adopters would appreciate being able
to use big-data learning analytics to aid instructional delivery, perhaps in an adaptive
technology model. The ability to capture what students know as soon as possible, in a
way that is low-threshold for both the learner and the teacher to make sense of the data,
would greatly assist educators in rerouting students’ progress if needed. An eager
adopter would hope for more useful data than simply examination responses. They
would encourage the ability to easily capture the high-quality, media-rich, qualitative
assessment artifacts their students have created. The ability to quickly share,
peer-review, and aggregate these products, as well as to assess them against
inter-rater-reliable rubrics, would be an ideal way to use the big data phenomenon.
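The “inter-rater reliable rubrics” mentioned above imply a concrete agreement statistic. One common choice (an assumption here; the paper does not name one) is Cohen’s kappa, which corrects raw rater agreement for the agreement expected by chance. A minimal sketch:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters scoring the same artifacts."""
    n = len(rater_a)
    # Observed agreement: fraction of artifacts given identical scores.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each rater's marginal score distribution.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Two raters scoring six student artifacts on a 3-point rubric.
a = [3, 2, 3, 1, 2, 3]
b = [3, 2, 2, 1, 2, 3]
print(round(cohens_kappa(a, b), 2))  # 0.74
```

Kappa near 1 indicates raters apply the rubric consistently; values near 0 suggest the rubric’s levels need clearer descriptors before aggregated scores can be trusted.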
Leadership & Values
(Innovative management, Social entrepreneurship, inclusion & equity)
Eager adopters search for innovative leadership; many times this becomes a deciding
factor in their employment at an institution. Perhaps we will encourage more
eager-adopting leadership as an additional parameter in this category of Leadership and
Values. The major indicator of eager adopter leadership is leadership that models eager
adoption itself. Some leaders build strategic plans that propose integrating innovative,
useful technologies, and then continue to use traditional means to do business instead
of integrating the technology themselves.
Connectedness
(Networking with the real world, Social networking, Learning events)
Eager adopters are also passionate about continuous learning, and many are dedicated
to the academy because of the opportunity to learn every day themselves. Therefore,
they would look to an institution to provide ongoing, current professional development,
such as offering a certificate course on teaching with mobile devices. In the past decade,
programs have been developed, even Master of Education degrees, addressing teaching
in higher education. However, with the rapid movement of mobile devices into the
classroom, there is no specific offering on how to effectively use mobile tools for
learning (Hargis & Cavanaugh, 2014). We all know that the key to success is not the
device, but the method of deployment and a hyper-focus on front-end faculty
development. Hargis, Cavanaugh, Kamali, and Soto (2013) created a four-tier model that
could be used as the basis for a certificate course. The model was generated to lead the
pedagogy portion of the United Arab Emirates’ large-scale federal mobile learning
project. The four steps are iChampion, iCelebrate, iCommunicate, and iSoTL.
Infrastructure (Physical space, ICT infrastructure)
Eager adopters will aggressively pursue a highly interactive ICT infrastructure, one that
allows faculty members to interact frequently without needing programming-language
capabilities.
Eager adopters could then dream up and find ways to resource leading-edge
technologies and explore how they can benefit educational settings. Possible high-end
technology-based initiatives that would require a rigorous ICT infrastructure include
wearable technology, the quantifiable self, and the Internet of Things. Google Glass is
just the beginning of interfacing wearable technology and learning. On the near horizon,
technologies are proposed that allow learners to wear devices that can monitor their
physical health, and perhaps someday their mental and intellectual health, via the
quantifiable self. The Internet of Things will help connect these data and make them
available to teachers and other experts.
AN EAGER ADOPTER PHILOSOPHY
Eager adopters focus on developing relationships and learner engagement. To accomplish
these, they focus on creating opportunities, which …
1. provide frequent authentic activities geared towards collaboration, which
parallels the type of behavior expected in and beyond learner careers;
2. integrate functional, meaningful mobile technology, which allow for easy access
to the community, and creation of media rich artifacts representing learner
efforts;
3. collect dynamic, formative assessment data, measured in authentic ways, such as
ePortfolios and screencasts, which can be used to redirect real-time
misconceptions;
4. include physical and virtual spaces with tactile manipulatables, such as 3D printers
and electronic building blocks, creating a “Play with Purpose” attitude;
5. contain informal micro-teaching areas, and green screens for recording and
editing digital video to provide active practice space for reflection and self-
regulation; and
6. focus on inquiry, project-based learning, where learners are creators instead of
consumers and are allowed to intrinsically develop their own meaningful driving
questions.
Eager adopters accomplish their aspirations through creating engaging learning
environments, which:
1. provide many casual interactive areas where conversations are supported
among colleagues, students, teachers, and the community;
2. encourage active physical, mental and spiritual interactions, which can be
enhanced with integrated, seamless technology;
3. intentionally integrate the curriculum throughout campus, creating a campus-
wide learning environment that learners can use for challenge-based learning;
4. have supportive leadership with a frequent presence, who model meaningful
technology use;
5. are ideal examples of sustainability, providing public showcases throughout
the world;
6. are able to see beyond the classroom and knowledge acquisition, creating
holistic learners with much needed social emotional competencies;
7. understand the importance of internationalization and the promotion of
frequent interplay between their students abroad and welcoming foreign
students;
8. reward the Scholarship of Teaching and Learning (SoTL), which produces
internal and external evidence-based literature on best practices in teaching;
9. exemplify a learning culture that values inquiry and self-directed learning.
Critical to eager adopter success are colleagues who design and facilitate innovative and
personalized learning experiences because each:
1. is an enthusiastic, passionate, curious, caring risk-taker who embraces failed
events as necessary for meaningful learning;
2. understands that not all students think as the educator does, so the
educator’s role is to design and lead multiple flexible pathways for deep
individualized learning;
3. exemplifies an engaging story-teller according to his/her personal style,
translating a love for the discipline to learners;
4. can “teach with their mouth shut” to open opportunities for broad student
voice and activity (Finkel, 2000);
5. is respected as an expert, attuned to learner needs, and acts as a coach,
mentor and motivator; and
6. collaborates and shares ideas openly and freely with colleagues for the
betterment of all students.
The most important attribute for eager adopter success is learners who are able and
ready because they:
1. are engaged, curious, playful, open-minded, spirited, responsible, hard
working, goal-oriented, persistent, organized, motivated, independent risk-takers;
2. approach learning as a connected and meaningful experience with peers and
the community and they contribute to the community;
3. develop grit for enduring failed events as challenges that will be overcome
with time; recognize the importance of guided practice; and apply the habits of
project management to their learning pathways;
4. try to use a range of tools and resources, including media, games and
manipulatables;
5. have support from teachers, administrators, parents, business leaders and
peers; and
6. are less concerned about grades and more about learning (intrinsic
motivators).
EAGER ADOPTER QUESTIONS
Finally, some thoughts on the types of questions that keep an eager adopter awake at
night. Some believe the “smart ones” are those who have the answers; eager adopters,
however, know that the future depends on those who know the key questions.
What do the learning environments, educators, and learners of the future
look like?
What pedagogies are required to empower a 21st-century learning
community?
What framework and process will we use to create a progressive vision?
How will we prepare the community for change?
What professional development is needed, and how do we deploy and sustain it?
What instruments will we create to accurately “measure the difficult to
measure”?
What mobile teaching and learning tools will be required, and by whom?
How do we integrate work-style with life-style, while integrating change?
What is the role of new ways to communicate, including texting,
podcasting, virtual worlds, social networking, blogs, and future tools?
How do we prepare for remote learning?
How do we migrate information, access and philosophy to the cloud?
How do we maximize the function of our current LMS, while maintaining
flexibility for when we need to change?
Will we develop rich content management systems to search, store and
share?
What eLearning Objects are available and what is the role of textbooks?
What interactive multimedia components can be created internally, and which
should be outsourced?
What are the links to National Curricula, internal and external agency
repositories?
BIODATA and CONTACT ADDRESSES of the AUTHOR
Dr. HARGIS has been a College Director in the UAE; an Assistant
Provost and Associate Professor at the University of the Pacific; and a
Director of Faculty Development and Assistant Professor at the
University of North Florida. He has authored a textbook and an anthology,
written over one hundred academic articles, and offered hundreds of
national and international academic presentations. He earned a B.S. in
Oceanography from the Florida Institute of Technology, an M.S. in
Environmental Engineering Sciences, and a Ph.D. in Science Education from the
University of Florida. His research agenda focuses on how people learn with the use of
emerging instructional technologies.
Jace HARGIS, PhD
San Diego, CA 92101
Email: jace.hargis@gmail.com
REFERENCES
Finkel, D. (2000). Teaching with your mouth shut. Portsmouth, NH: Heinemann.
Goleman, D. (1996). Emotional intelligence: Why it can matter more than IQ. London:
Bloomsbury Publishing.
Hargis, J., & Cavanaugh, C. (2014). A one year federal mobile learning initiative review.
Encyclopedia of Information Science and Technology, Third edition.
Hargis, J., Cavanaugh, C., Kamali, T., & Soto, M. (2013). A federal higher education iPad
mobile learning initiative: Triangulation of data to determine early effectiveness. Journal
of Innovation in Higher Education, 39(1).
Johnson, L., Adams Becker, S., Estrada, V., & Freeman, A. (2014). NMC Horizon Report: 2014
Higher Education Edition. Austin, Texas: The New Media Consortium.
Mauro, C. (2011). Why Angry Birds is so successful and popular: a cognitive teardown of
the user experience. PulseUX Blog Theory, Analysis and Reviews on UX User Experience
Research and Design. Retrieved August 15, 2014 from
http://www.mauronewmedia.com/blog/why-angry-birds-is-so-successful-a-cognitive-
teardown-of-the-user-experience/.
Soto, M., & Hargis, J. (2014). Students ExplainEverything using iPads. ISTE Learning and
Leading with Technology, 32-33.
Journal of Information Systems Education, 34(4), 360-369, Fall 2023
Teaching Tip
A Scalable Hybrid Introductory
Analytics Course
David P. Darcy
Kelley School of Business
Indiana University
Bloomington, IN 47405, USA
davdarcy@iu.edu
Asish Satpathy
W. P. Carey School of Business
Arizona State University
Tempe, AZ 85281, USA
asish.satpathy@asu.edu
ABSTRACT
We report on the design and development of an introductory analytics course delivered to almost 10,000 undergraduate business
students to date. One novel aspect of the course is its orientation to add analytics capabilities to a business student’s toolbox,
resulting in significant design and implementation implications. We anchored the course on three fundamental principles:
maximizing learning, operating at scale, and a consistent experience for all learners. To enable a rigorous and valuable learning
experience, the underlying course curriculum is based on the modified CRISP-DM (CRoss Industry Standard Process for Data
Mining) framework. Bloom’s taxonomy is applied to the course assessments to evaluate the depth of learning. The course is
delivered in a hybrid mode, arguably combining the best of online and face-to-face delivery modes. In a naturally occurring
experimental setting, the COVID-19 pandemic accelerated the evolution of the course and generated additional reinforcing lessons.
We explore those lessons and suggest directions for further research.
Keywords: Introductory analytics, Asynchronous learning, COVID-19, Bloom’s taxonomy, CRISP-DM
1. INTRODUCTION
The application of analytics in business has seen revolutionary
change thanks to a substantial increase in data availability, an
increase in breadth and sophistication of analytical methods, a
myriad of new tools, and persistent storage and processing cost
reductions, to name some major contributors (Dinter et al.,
2017; Gupta et al., 2015; Jaggia et al., 2020; Schiller et al.,
2015; Wixom et al., 2014). Educating the workforce in
analytics and keeping up with its evolution has become both
more imperative and challenging (Firth et al., 2011; Paul &
MacDonald, 2020; Rodammer et al., 2015; Wilder & Ozgur,
2015; Williams & Elmore, 2021; Zadeh et al., 2018; Zhang et
al., 2020). As a highly cited McKinsey report makes clear,
obtaining analytical knowledge, skills, and abilities (KSAs) is
no longer simply desirable but becoming a fundamental toolset
for almost any role, function, organization, and industry
(Manyika et al., 2011). There continues to be a widening
workforce gap between the supply and demand of those with
analytical KSAs (Doshi & Krishan, 2020). In recognition of this
unmet need, AACSB has revised its curriculum standards to
encompass Analytics KSAs (AACSB, 2020).
The case has been made that Information Systems (IS) is
perhaps the most appropriate single discipline to develop and
deliver analytics curricula given the already existing
interdisciplinary focus of IS (Agarwal et al., 2014; Burns &
Sherman, 2019). We build on a tradition of analytics curricula
development in IS (Gupta et al., 2015; Schiller et al., 2015; Topi
et al., 2010; Wixom et al., 2014; Zhang et al., 2020) by reporting
on an introductory analytics course.
How do we implement an analytics curriculum that will
satisfy the organizational needs of a pan-disciplinary audience?
We achieve this goal by first introducing the concept of
problem-solving using analytics in a “business function”
agnostic way. Later, we introduce analytical techniques that can
be used to solve these problems in real-world settings.
This paper aims to detail the design and implementation of
an introductory analytics course for undergraduate students
across the entire range of business disciplines. Several aspects
of the course combine to generate a unique context worthy of
further study and provide several valuable lessons. For
example, the choice of a hybrid delivery mode (for reasons
discussed in Section 3.2) was made before the COVID-19
pandemic so dramatically changed the landscape for us all; the
pandemic ushered in and accelerated several additional aspects
of the course implementation. Reflecting on what occurred in
the course before, during, and after the pandemic highlighted
the course’s unique challenges and benefits. Finally, we report
the valuable lessons we learned to the larger community of
scholars and educators.
The course discussed in this paper has another novel aspect:
many of the curricula cited have a stated goal of educating for
analytics roles such as data analysts, data specialists, and data
scientists (Wilder & Ozgur, 2015). We are educating the entire
gamut of business students in using analytics for problem-
solving as part of their larger role, whatever that role may be.
Providing an introductory analytics course to all business
students enables departments to offer discipline-specific
analytics courses.
Moreover, changing the motivation of a significant part of
the audience from “I choose to do this course/major/program”
to “I have to do this because it’s a requirement” has major
implications for building course engagement. These
distinctions in audience orientations may be subtle, but they
substantially impact the course design. Some of the design
choices for the course include the potential to spark curiosity
towards using data analytics methods without making students
fully proficient in specific data roles at the end of the course.
This paper describes a novel design for an introductory
analytics course for undergraduate students at a large public
business school in the Southwestern United States. To ensure
rigorous course foundations, we implemented the curriculum
inspired by the CRISP-DM (CRoss Industry Standard Process
for Data Mining) framework; learning assessment is evaluated
using Bloom’s taxonomy. We share the valuable lessons
learned about hybrid learning at scale before and during the
pandemic through delivery to almost 10,000 students.
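As a concrete sketch of the two frameworks just named: the six CRISP-DM phases below are canonical, but the Bloom's-taxonomy target attached to each phase is purely illustrative (an assumption for this sketch; the paper does not publish its actual assessment mapping).

```python
# Canonical CRISP-DM phases, in order.
CRISP_DM_PHASES = [
    "Business Understanding",
    "Data Understanding",
    "Data Preparation",
    "Modeling",
    "Evaluation",
    "Deployment",
]

# Hypothetical Bloom's-level target an introductory assessment might aim at
# for each phase (illustrative assumption, not the authors' mapping).
BLOOM_TARGET = {
    "Business Understanding": "Understand",
    "Data Understanding": "Understand",
    "Data Preparation": "Apply",
    "Modeling": "Apply",
    "Evaluation": "Analyze",
    "Deployment": "Understand",
}

for phase in CRISP_DM_PHASES:
    print(f"{phase}: target Bloom level = {BLOOM_TARGET[phase]}")
```

Pairing each process phase with an explicit cognitive-depth target is one way to make "evaluating depth of learning with Bloom's taxonomy" operational in a course plan.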
The remainder of this paper details the course design,
illustrating the reasoning behind essential design decisions,
including the choice to implement a hybrid delivery. It also
describes the impact the pandemic had on the course
implementation. Finally, the paper concludes with lessons
learned and future work.
2. COURSE PHILOSOPHY
We believe that our introductory analytics course presents a
unique combination of design and implementation choices. For
example, we offer the course in a hybrid delivery mode.
Further, the course is relevant to students from different majors
with a broad range of interests, backgrounds, and capabilities.
An active learning orientation in the course design increases
student engagement (Burch et al., 2019; Mann et al., 2020).
Many of our reported course design and implementation
choices are not new, for example, using a hybrid delivery mode
for an analytics course. However, it is the combination of many
design decisions that make our contribution unique; analysis of
these choices, the culmination of five years of experience and
almost 10,000 students, as reported in this paper, is of value to
faculty in similar contexts.
In the following two sections, we discuss how the various
design and implementation decisions were resolved.
Specifically, in this section, we segment the explanation of the
course philosophy into subsections on the development of
course content and learning assessment.
2.1 Course Curriculum Development
A tremendous amount of work has been done in designing
analytics curricula to meet organizational and educational
needs at various levels of higher education, including graduate
degree programs, graduate, and undergraduate major and minor
programs of study, as well as a variety of standalone elective
graduate and undergraduate courses. Though analytics is a
highly cross-functional field, many of these curriculum
development efforts have taken place in business schools in
general and often in IS departments (Gupta et al., 2015; Schiller
et al., 2015; Topi et al., 2010; Wixom et al., 2014; Zhang et al.,
2020).
The senior leadership of the Business School projected the
critical role that analytics would play in the future, concluding
that it needed to be part of the core business curriculum. A
cross-disciplinary faculty team confirmed the need through
their research and developed the initial curriculum for
evaluation by the curriculum committees. The primary
objective of the course development was to fulfill the need for
a wide variety of roles in an increasingly analytics-rich
professional and business environment. The course serves as a
launch pad for further discipline-specific analytics courses.
Starting in Fall 2017, the Business School added an
introductory-level undergraduate analytics course as a required
upper-division, three-credit course for every student enrolled
in a Business School program. Over the past several years,
the course has evolved substantially to meet a wide range of
challenges and opportunities, including moving from fully face-
to-face synchronous delivery to hybrid delivery beginning in
Fall 2019.
More than 4,000 learners engage in the course in an academic
year. The necessary coordination among the faculty at that scale
(with up to a dozen faculty across three departments and three
campuses) partially contributed to the hybrid delivery choice.
A series of guiding principles drove many decisions in
designing and implementing the course. The first principle was
to maximize learning; maximizing learning is not about
covering an exhaustive list of topics with extensive assignments
but rather about choosing the topics and assessments that speak
most directly to introductory analytics, given the constraints of
a single three-credit course for a vast audience. This principle
led us to consider active and engaging course content and
assessment.
We bring contributions from faculty across the Business School
to introduce analytics in each discipline's context, ensuring the
course provides an engaging motivation. Furthermore,
we use actual data and business cases in labs, assignments, and
group work.
The second principle driving our design decisions was the
consideration of scale. Educating more than 4,000 students
yearly makes some choices clearly practical and rules out
others. For example, as the course ramped up from a few
sections in the early semesters to more than 50 per year, a
significant proportion of the extensive set of assessments had to
be automated. This automation was done carefully to preserve
the rigor of the assessments while maintaining the timeliness of
the feedback provided.
Finally, the third principle we worked with was ensuring
consistency in the design and implementation to provide a
similar experience to all learners. With more than 50 sections
of the course taught by as many as a dozen faculty across three
campuses and several departments, a very high level of
consistency is maintained around course content, assessment,
and related elements. In tandem with this principle, and to leverage the breadth of
faculty knowledge and experience, substantial coordination
across groups, including faculty, the IS department, the school
administration, and the learning design and support teams, takes
Journal of Information Systems Education, 34(4), 360-369, Fall 2023
362
place to maintain a high quality yet continuously improving and
engaging course.
Guided by these principles, we continue to develop a highly
intriguing and engaging introductory analytics course. It
includes a low barrier to entry to enable a vast range of student
learners to get past the initial motivation challenges associated
with “I have to take this course, as it is required.” Because data
analytics increasingly draws on liberal arts skills, such as
storytelling and the effective communication of data insights,
the course in its current design could also be offered to a
broader audience, including undergraduate students from the
Arts and Engineering.
2.2 Course Content
Our current design is based on these overall learning objectives:
• What problems can be solved using analytics?
• How do we analyze and find insights with data?
• How can organizations affect the data creation and
generation process?
• How do organizations generate, store, and organize
data?
Inspired by the CRISP-DM framework for data mining
(Chapman et al., 2000; Jaggia et al., 2020), we developed a
pedagogical framework for teaching the course, as shown in
Figure 1. The CRISP-DM framework is the most
comprehensive guiding principle for carrying out analytics
projects and developing analytics curricula in higher education.
Keeping in mind the scope and limitations of this course, the
proposed framework is a modified version of the original
CRISP-DM. Our framework includes most of the steps in the
CRISP-DM, though not all for our introductory course. For
example, we do not use the “Deployment” phase of the CRISP-
DM framework as it is not feasible to include it given the nature
of the course. The modified framework highlights the interplay
between different phases of the analytics process and shows
how they collectively contribute to an analytics project. A
detailed explanation of each step of the framework follows.
Figure 1. Pedagogical Framework for the Introductory
Analytics Course
2.2.1 Step 1: Business Understanding. Students are
introduced to various business problems, each with different
learning outcomes, and apply analytical reasoning to solve
them. This process examines the role of multiple business
functions and their existing interrelationships. Associated data
is typically provided for a specific business problem; in
addition, students are taught how they could potentially collect
data via surveys and secondary data sources.
2.2.2 Step 2: Data Understanding. Knowledge of data type,
data quality, and data insights is crucial at the point where
students learn to experiment with exploratory data analysis and
data visualizations. We use a more comprehensive range of
industry-approved analytics tools than is utilized in Excel-
focused courses (Frost et al., 2021), such as Tableau
(tableau.com) and JMP Pro (www.jmp.com). The additional
tools are just as accessible and have a broader set of data
handling and analytical capabilities. When evaluating data for
a group project, students are encouraged to have face time with
the instructors to receive feedback on their approach to solving
their chosen business problems and the corresponding analytics
techniques.
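The first-pass questions of data understanding that students answer in Tableau or JMP Pro can be sketched equivalently in Python with pandas; the data frame below is invented for illustration and is not taken from the course materials.

```python
import pandas as pd

# Hypothetical regional sales data; column names and values
# are illustrative only, not from the course
df = pd.DataFrame({
    "region": ["West", "East", "West", "South", "East", "West"],
    "units": [120, 95, 140, 80, 110, 130],
    "revenue": [2400.0, 1900.0, 2800.0, 1500.0, 2200.0, 2650.0],
})

# Data types and missing values: the first questions of data understanding
print(df.dtypes)
print(df.isna().sum())

# Summary statistics reveal scale, spread, and potential outliers
print(df.describe())

# A simple grouped view often surfaces a first business insight
print(df.groupby("region")["revenue"].mean())
```

The same exploration in a visual tool would typically continue with histograms and scatter plots of these columns.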
2.2.3 Step 3: Data Preparation. In the next phase, students
learn how to extract clean, pertinent data from the raw data
made available to them. In this step, students are taught to use
Excel and JMP Pro for various data preparation steps, including
data cleaning and transformation for subsequent analyses.
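The cleaning and transformation steps described above, which the course teaches in Excel and JMP Pro, might look as follows in a pandas sketch; the raw data and its quality problems are hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical raw extract with typical quality problems:
# a duplicate row, a missing name, and a non-numeric entry
raw = pd.DataFrame({
    "customer": ["Ann", "Ben", "Ann", None, "Cal"],
    "spend": ["120", "85", "120", "60", "n/a"],
})

# Coerce spend to numeric; unparseable entries become NaN
raw["spend"] = pd.to_numeric(raw["spend"], errors="coerce")

# Drop exact duplicates and rows missing key fields
clean = raw.drop_duplicates().dropna(subset=["customer", "spend"])

# A log transform is a common preparation step for skewed spend data
clean = clean.assign(log_spend=np.log(clean["spend"]))
print(clean)
```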
2.2.4 Step 4: Explanatory Analysis. Students are introduced
to the foundation of inferential statistics and its applications in
real-world problems. A case-based approach to teaching makes
applied statistics more engaging to the students in the context
of solving business problems. Students learn to develop
hypotheses and apply techniques such as t-tests and one-way
ANOVA to infer potential explanations for the observed effect
in the population. This reinforces their approaches to problem-
solving using inferential statistics and developing actionable
solutions.
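The hypothesis-testing workflow described above can be illustrated with SciPy; the sales figures below are invented, and the t-test and one-way ANOVA stand in for the analyses students carry out in JMP Pro.

```python
from scipy import stats

# Hypothetical daily sales under a control condition and a promotion;
# the numbers are invented for illustration
control = [52, 48, 55, 50, 47, 53]
promo = [58, 61, 56, 60, 63, 57]

# Two-sample t-test: is the difference in mean sales significant?
t_stat, p_value = stats.ttest_ind(control, promo)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# One-way ANOVA generalizes the comparison to three or more groups
region_a = [50, 52, 49, 51]
region_b = [55, 58, 54, 56]
region_c = [61, 60, 63, 62]
f_stat, p_anova = stats.f_oneway(region_a, region_b, region_c)
print(f"F = {f_stat:.2f}, p = {p_anova:.4f}")
```

A small p-value in either test would lead students to reject the null hypothesis of equal group means and to propose a business explanation for the observed effect.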
2.2.5 Step 5: Modeling. Next, fundamental problem-solving
techniques are introduced: model building using supervised
learning (linear and logistic regression) and unsupervised
learning (k-means and hierarchical clustering). Teaching steps
followed in this process include (1)
identifying the suitable type of model, (2) building regression
models using appropriate variables in context, and (3)
validating regression model accuracy and predictability.
Students articulate and interpret these results to develop
innovative solutions for the business problem.
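The supervised and unsupervised techniques named above can be sketched with scikit-learn; the toy advertising and customer data here are invented for illustration and do not reflect the course's data sets.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

# Hypothetical data: advertising spend (in $1,000s) vs. sales
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([2.1, 4.0, 6.2, 8.1, 9.9])

# Supervised learning: fit a simple linear regression and
# validate its accuracy with the R^2 score
reg = LinearRegression().fit(X, y)
print(f"slope = {reg.coef_[0]:.2f}, R^2 = {reg.score(X, y):.3f}")

# Unsupervised learning: k-means groups observations without
# labels (two obvious clusters in this toy customer data)
customers = np.array([[1.0, 1.0], [1.5, 2.0], [8.0, 8.0], [8.5, 9.0]])
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(customers)
print(km.labels_)
```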
2.2.6 Step 6: Knowledge Evaluation. Hands-on practice of
data analytics using these techniques complements the overall
approach to problem-solving. With a holistic focus on
actionable analytics, how and why organizational data beyond
transactional data are collected is also considered, particularly
concerning the potential biases in data collection, interpretation,
and decision making. Further, we review data ethical issues,
including privacy, security, accountability, transparency, and
fairness. Towards the end of the course, infusing conceptual
knowledge of data storage techniques, big data, and AI builds
curiosity and knowledge in the context of value creation for a
business.
The proposed framework allows us to integrate the course
objectives for students to apply their learned analytics skills to
analyze real-world problems using publicly available data (e.g.,
Kaggle) in a substantive group project.
2.3 Course Learning Assessment
Bloom’s Taxonomy has weathered the test of time very well:
originally published in 1956 (Bloom, 1956), revised in 2002
(Krathwohl, 2002), and still widely referenced today, it defines
six levels of learning that guide assessment of the depth of
learning by participants in a course.
Below, we describe how we utilized Bloom’s Taxonomy to
ensure rigorous and complete coverage of Bloom’s six levels of
objective educational achievement by the assessments in the
course.
2.3.1 Level 1: Remember. Before each session, in the online
element of the hybrid mix, the conceptual foundations for the
module are delivered via lecture videos and articles made
available to the learners; recall of the concepts is assessed
through an online quiz before each session throughout the
semester. Students thus do the lower-level cognitive work
(knowledge comprehension) before class.
2.3.2 Level 2: Understand. Understanding is more profound
than simple memorization and necessitates more reflection and
time to absorb the concepts thoroughly. We move beyond the
fundamentals to see the application in real-world examples and
data. We assess students’ comprehension of the subject in a
final exam.
2.3.3 Level 3: Apply. Most modules include a hands-on
component (labs) using analytics software such as JMP Pro,
Excel, and Tableau, where a significant amount of classroom
time is devoted to applying the newly learned concepts to
reinforce how organizations benefit from analytics. The labs are
assessed through weekly applied homework.
2.3.4 Level 4: Analyze. Beyond labs, students analyze data
from business case situations. These weekly assignments assess
critical thinking and students’ ability to apply analytical and
technical capabilities to solving a real-world problem.
2.3.5 Level 5: Evaluate. Students evaluate business cases and
make holistic recommendations. This culminates in a practical
exam where business data and problems are addressed by
learners using the entire variety of tools and techniques
presented in the course to that point.
2.3.6 Level 6: Create. Student teams investigate real-world
data and develop actionable recommendations for feedback and
evaluation to present to their peers. The learners gain significant
experience in the art of storytelling and in convincing their peers
that they have generated solutions of considerable value to a
business.
Research suggests a variety of learning outcomes, such as
declarative and procedural knowledge acquisition and skill
acquisition (Colquitt et al., 2000). Declarative knowledge is
often considered the “what” of a topic, the theoretical and
conceptual knowledge of that topic. Procedural knowledge can
be thought of as the “how” of a topic, as application of
declarative knowledge. Skill acquisition involves personal
mastery over the “what” and “how” of a topic.
Our current assessment framework includes regular
declarative and procedural knowledge acquisition testing,
culminating in substantial skill acquisition testing, to design a
rigorous and engaging introductory analytics course
(Marjanovic, 2012). For roughly the first two-thirds of the
course, through the practical exam, learning and application are
executed individually. In the remainder of the course, students
also work on some of the softer skills: working in groups, they
take a larger and more extensive data set, choose and execute a
thorough analysis using the concepts, tools, and techniques
learned to that point, and present their findings and
recommendations in the role of business consultants.
Assessment is continuous from the first
week to the last; the sequencing of the different assessment
categories is shown in Figure 2. The blue-colored parts of each
arrow indicate when a particular form of assessment was being
applied during the course. For example, the evaluation of
procedural knowledge accumulation took place from the
beginning of the course through each of the first eight weeks
and is represented by the blue arrow stretching to Week 8; skill
acquisition developed during that accumulation is assessed in
the practical exam administered around week 9 of the course,
represented by the short blue arrow.
Figure 2: Assessment of Learning for the Introductory
Analytics Course
3. COURSE IMPLEMENTATION
We begin with a semester schedule that operationalizes the
preceding discussion, then detail two implementation issues
with significant course implications: hybrid course delivery and
the unanticipated effects of the pandemic.
3.1 Semester Schedule
The appendix presents a 15-week schedule based on the course
philosophy described in Section 2. The topics are sequenced in
the curriculum based on the modified CRISP-DM framework
and assessments discussed in the previous section. The Spring
2022 schedule serves as the basis for this tentative outline,
which is the culmination of several years of content
realignment, sequence changes, and other improvements; that
refinement is ongoing.
The course commences with an introduction to problem-
solving using analytics and swiftly covers a broad set of
analytical techniques that reinforce the value of analytics in
decision-making. Covering these techniques during the early
part of the course equips the learners to tackle the challenges
they face in preparation for the group presentations (cases)
towards the end of the course. A sequence of learning and
relearning reinforces the application of analytics and optimizes
the educational experience.
Vital support for scale, consistency, and flexibility of
implementation across multiple sections, instructors, and
campuses is secured via the choice of an appropriate learning
management system (LMS). When the course was first offered
in Fall 2017, the Business School was using the Blackboard
LMS though the wider University was using Canvas.
Ultimately, the Business School adopted Canvas, and, starting
in Fall 2019, Canvas became the LMS of choice for the
Business School and this course. While both LMSs are broadly
similar, subtle differences surfaced in Fall 2019 and were
handled more gracefully in subsequent semesters. One Canvas
feature proved essential: a course “blueprint” that enables us to
make course-related changes in one place and push them to the
multiple live course sites almost instantaneously.
3.2 Hybrid Delivery
For centuries, the dominant form of learning delivery has been
the so-called “chalk and talk” or “sage on the stage” style.
Correspondence courses and distance learning have existed for
more than a century (“Distance Education,” n.d.). The rise of
the Internet has furthered delivery mode experimentation and
the implementation of viable alternatives to the conventional
“chalk and talk” style. The main modes of learning include
synchronous delivery (“live”), asynchronous delivery
(“recorded”), and various hybrid or blended versions of both.
Hybrid learning captures the benefits of both face-to-face
and online instruction and integrates in-person and online
content, based on in-person and online education best practices.
In hybrid delivery, a substantial portion (between 30% and
79%) of the course content is delivered online, with fewer face-
to-face meetings (Allen & Seaman, 2016). Assessments that
cover both online and in-person activities are necessary.
Although face-to-face delivery is generally considered the
richest way to learn (Dennis et al., 2008), a prior meta-analysis
suggests that students in hybrid settings had better learning
outcomes than peers in face-to-face settings (Means et al., 2009).
Though that reference is highly cited with thousands of
referring papers, it may be considered a little dated (where was
Zoom then?); more recent evidence continues to report hybrid
delivery adding to learning effectiveness (Noetel et al., 2021;
Scaringella et al., 2022; Wang et al., 2022).
Our decision to develop this introductory analytics
curriculum in a hybrid format is based on applying the guiding
principles of maximizing learning, at scale, and consistently
across a range of students and instructors that we established
earlier in the course development process. We believe that a
hybrid delivery enables us to maximize outcomes on several
dimensions outlined below.
3.2.1 Students’ Experience. Such an introductory course, like
the field of analytics itself, is a recent innovation for Business
School curricula; there was little guidance for incorporating
students’ interest and passion to inform course design and
implementation when we began this five-year odyssey.
Therefore, we focused on developing content that would appeal
to a wide range of students from a variety of academic
backgrounds. We partially achieve this goal by implementing
active learning techniques (Prince, 2004) to maintain high and
continuous engagement. Many proponents of active learning
suggest that the effectiveness of this approach depends on the
student’s attention span during the lecture, which we see
progressively diminishing because of available technological
distractions during class. At the same time, instructors with
varying experience and abilities to teach with experiential
pedagogy could find the task challenging when teaching this
course. A shared hybrid platform instantiated through the LMS
blueprint is beneficial for maintaining uniformity of students’
understanding and overall experiences across sections. The
blueprint enables us to deliver most of the first exposure to new
conceptual material outside the class, primarily via lecture
videos on the LMS, and to use class time for knowledge
assimilation and reinforcement through problem-solving,
discussions, and hands-on activities.
3.2.2 Flexibility. Flexibility in the course delivery method is
always a prime design consideration. The hybrid design offers
several further forms of flexibility: (1) students learning on
their own time frame, (2) accommodating different learning
styles, (3) enhancing students’ understanding of the relationship
between concepts and their applications in the real world, and
(4) a flipped classroom that encourages students to engage
more. In addition, the design choice harbored a hidden
flexibility benefit that materialized only when the pandemic
caused an unprecedented disruption to learning.
Transitioning from a hybrid delivery mode to pure synchronous
online was relatively easy and seamless, with little additional
preparatory effort when the pandemic struck. This enabled a
global engagement for the course with students from around the
globe, including across the US, Europe, Asia, and the Middle
East.
3.2.3 Learning Environment. To succeed, the course must
provide an environment where students and teachers can
discuss content, exchange ideas, debate, and share their
thoughts. A hybrid mode facilitates more engagement in the
class and, hence, more overall learning, especially for an
analytics course when forming questions about a specific
business problem and analyzing to address underlying business
challenges. Given the wide range of student preparedness and
capabilities, the preference is for the learning environment to be
“interactive and engaging,” enabling students to learn through
discovery and fun. A once-a-week meeting generally proves
adequate for students to engage in conversation and to apply the
concepts learned from the online modules by practicing real-
world data analytics in each class meeting. In addition, faculty and
teaching assistant (TA) office hours close any gap as needed.
3.2.4 Collaborative Knowledge Building. In collaborative
knowledge building, group activities are centered around
sharing responsibility for learning, distributing expertise, and
building on each other’s ideas (Hmelo-Silver & Barrows,
2008). With available technologies and applications such as
Slack, Google suite, and iClicker, students can collaborate
effectively and engage in knowledge building both in and
outside the classroom. Combining tools support and in-person
interactions multiplies the opportunities for collaborative
knowledge building. In addition, faculty can facilitate the
experience by monitoring progress and providing appropriate
feedback.
3.2.5 Teaching Efficacy. Faculty can be hesitant to teach fully
online courses (Guppy et al., 2022). An apparent gap between
students’ perceptions and expectations of the subject and the
content provided on the online platform creates uncertainty that
directly affects teaching efficacy. A
hybrid design makes it easy for faculty to bridge these gaps
while bringing their analytics expertise to the classroom and
sharing a unique student experience during the weekly in-
person meeting. Students across many sections of this course
could then study the same content while experiencing a unique
teaching approach from their faculty member.
Our course adds another layer of challenge for faculty
because incoming students span a broad spectrum of
preparation and motivation. Such diversity could potentially
affect teaching efficacy as reflected in teaching evaluations. We
addressed this problem with a hybrid
design choice by providing additional resources (recorded
videos, online tutoring by teaching assistants outside the
classroom) for students with more significant challenges. With
no standard textbook prescribed for the course, the faculty can
practice creativity and innovation to individualize teaching and
learning. The overall learning outcome can be positive with
proper coordination of this introductory course across various
sections and a shared course foundation that is evolved and
consistently adopted by all faculty.
3.2.6 Sustainability. Though not considered initially as part of
the design, given its increasingly pertinent nature, we also
considered sustainability. Physically meeting only once each
week reduces travel, parking, and physical space requirements.
This course is thus likely to have a lower carbon
footprint than comparable face-to-face courses. However, there
is little commentary on this topic in the literature. Given the
current interest in this topic, more research needs to be done. If
we can confirm the hybrid design is better for learning, the
sustainability of the design is a further bonus.
3.3 COVID-19 Pandemic
With the onset of the COVID-19 pandemic, the University went
from regular in-person instruction to synchronous instruction
over Zoom (www.zoom.us). This change was announced at the
start of the one-week-long spring break in 2020 and was
implemented for the remainder of the spring semester. That
style persisted through Fall 2020, Spring 2021, and Summer
2021 semesters with a return to the in-person classroom for the
Fall 2021 semester with students and faculty required to wear
masks. Almost exactly two years after the initial changes
brought about by the pandemic, over the spring break of 2022,
in-class learning was changed to mask optional, marking what
we hope is the final chapter in returning to “normal.” While the
pandemic raised significant challenges for educator
communities worldwide, we experienced minimal disruption in
the course, given the choice of a hybrid format that was already
implemented. Moreover, the pandemic provided an
environment for further refinement of the design.
To get a sense of how the pandemic changed engagement
from in-person class sessions to synchronous but remote Zoom
sessions, the map in Figure 3 shows a cross-section sample of
289 students enrolled in Spring 2021 who voluntarily shared
their remote location. Whereas an in-person class requires a
weekly physical place in the classroom, no such bound existed
while the synchronous class meetings occurred over Zoom.
Each map dot represents a cluster of students remotely logged
in from that location to complete their coursework. There were
also a few students based in Europe, none of whom participated
in the location sharing.
Weekly synchronous sessions over Zoom replaced in-class
meetings where enrolled students in each section joined from
their location from anywhere in the world at their respective
(local to the University) class section times. Each synchronous
session lasted 75 minutes (the same length as an in-class
session), was recorded via Zoom, and subsequently shared
among students. Doing so benefited many, especially those who
could not attend the live class because of a time zone or other
conflict and who could instead review the recordings at a
convenient time. We performed the weekly hands-on activities
(application of data analytics techniques using Excel, JMP Pro,
and Tableau) via Zoom sessions. These sessions were engaging
because students could immediately raise questions, doubts,
and technical difficulties in parallel. Troubleshooting any technical
problems was greatly facilitated via the screen sharing feature
of Zoom. Although we observed a decrease in direct student-
teacher interaction before, after class, and/or around campus,
we also observed an uptick in student content interaction in the
learning management system as measured by the average time
spent on the learning management system. At mid-semester, we
proctored a practical exam during class time via Zoom as a
stand-in for doing it in a physical class setting.
Figure 3: Global Engagement by Course Students during
the Pandemic
At the beginning of the pandemic-era instruction, there was
a heightened concern among students for overall success in the
class. Over time, however, their fears subsided when they
started to engage in the classroom through attendance and
polling features available via Zoom. Students could ask
questions via Zoom’s private and public chat feature that a
fellow student in the class sometimes answered. Discussion and
debates in breakout rooms among group members were
enriching for many students. It brought an engaging atmosphere
during the course. Students could embrace this new teaching
method rather quickly because of the user-friendliness and
straightforward nature of the Zoom application. The University
had an enterprise implementation of Zoom before the pandemic
struck, and many of the faculty were already familiar with it.
This greatly facilitated the ease with which the faculty pivoted
to teaching the course over Zoom, which, in turn, helped lower
student anxiety about using Zoom.
4. ROAD FORWARD
In this section, we review the lessons learned from the
experience of providing the class to almost 10,000 students over
five years and from before, during, and after the pandemic. We
also highlight some of the challenges one could face in the
process of developing and implementing such a curriculum in a
business school.
4.1 Lessons Learned
Instruction via Zoom is not the same as classroom instruction.
We cannot assume that students’ behavior remains the same and
that they stay focused to the same degree in a virtual room.
Recent studies found that it is difficult for students to balance
their studies with the pressure of home and work commitments
during a crisis such as the pandemic (Jankowski, 2020) and that,
despite the increasing ubiquity of online technologies, we may
not be ready for online learning (Power et al., 2022).
Understanding students’ needs is critical; showing empathy
throughout the course was essential to boost self-fulfillment
among students. A pedagogy that prioritizes students’ needs is
always a winner, and our experience with this introductory
analytics course is no exception. Online education has taken on
a new face that teachers can get excited about, and the pandemic
provided a unique environment in which to innovate.
No single textbook could flex to cater to the entire course’s
needs. Ultimately, we chose to develop and use our own
materials, a very labor-intensive process, though it provided the
most flexible environment as the course evolved. Another
choice with a significant challenge was to
use multiple analytics tools (currently including Excel, Tableau,
and JMP Pro). While using various tools mirrors real
professional and business environments, the technical issues
inherent in doing so for a college course are significant. We
ultimately chose these three tools for their lower barriers to
getting started, shorter learning curves, and existing availability
of a site license.
To date, we have observed no significant change in
students’ performance in terms of final course grades as we
went from an in-person hybrid (Fall 2019-first half of Spring
2020) to a Zoom-based hybrid (second half of Spring 2020) and
back again to an in-person hybrid (Fall 2021). Moving from an
in-person class meeting to a Zoom meeting does not change the
synchronicity of the delivery, which perhaps explains the lack of any
measurable student performance difference in terms of their
final course grades. What we do not yet understand is the effect,
if any, on longer term learning retention. Nevertheless, as with
many aspects of pandemic life, this synchronous Zoom-based
mode of instruction has shown significant potential for learning
introductory analytics, if not beyond.
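The paper does not report the statistical procedure behind the grade comparison; one plausible way to test for a difference in final grades between delivery modes is a two-sample t-test, sketched here with invented grade samples.

```python
from scipy import stats

# Invented final-grade samples (percent) for two delivery modes;
# the course's actual data and test procedure are not reported here
in_person_hybrid = [82, 75, 90, 68, 88, 79, 85, 73]
zoom_hybrid = [80, 77, 88, 70, 86, 81, 84, 72]

# Two-sample t-test: a large p-value would be consistent with
# "no measurable difference in final course grades"
t_stat, p_value = stats.ttest_ind(in_person_hybrid, zoom_hybrid)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```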
4.2 Challenges and Caveats
As with any delivery mode that includes asynchronous
learning, such as the hybrid choice we made, a great deal of
preparation is necessary to create and deliver high-quality
asynchronous content and assessments; this is a significant up-
front time burden. While the scale helps to spread those efforts,
the burden is nonetheless significant because preparing material
for asynchronous consumption requires greater foresight and
experience than preparing in-person material: the feedback
loops usually present in person are absent for asynchronous
content.
Perhaps the greatest factor in student success we have
observed is students’ willingness and ability to take on the
greater responsibility of working with asynchronous content.
Hybrid is still relatively new and can be daunting as an
unfamiliar mode of learning (Power et al., 2022). Further, while
hybrid generally comes with less face-to-face time, some
additional support is necessary to catch those unfamiliar with
the challenges of hybrid learning or struggling with the hybrid
mode.
A potential challenge for this course concerns inclusiveness.
It is not common at universities to require students to bring a
personal laptop into the classroom; classes with significant
technical components, such as those discussed here, often meet in
computer labs. The hybrid delivery mode adopted for this course
therefore raised a concern, mainly because many of our
institution's students come from lower-income households. Not
holding the in-person lab sessions in a computer lab, running
assessment activities (such as the practical exam) in a classroom
not equipped with computers, and requiring students to use their
own laptops meant that some proportion of the student body
could struggle to participate effectively (Deng & Sun, 2022).
With almost 10,000 students having completed the course and
laptops being so fundamental to so much college activity, we
have encountered only a handful of situations where this digital
divide has arisen. We handled these comfortably by making
laptops available to borrow on a short-term basis from the main
campus library. We continue to monitor this issue.
We acknowledge that the proposed course design and
delivery may not be universally applicable for several reasons.
First, a successful offering of our course depends heavily on
students' discipline and on how well they manage their time to
get the best out of the course. At times, this can be a challenge
and a distraction for faculty to manage if they are not well
equipped with technical know-how and adequate class-management
skills. Requiring up-to-date technology (a computer and Internet
access) during class and outside the classroom could pose hurdles
for students who already struggle to attend school amid poverty
and inequality. Finally, while we describe our specific tool
choices, this course could be offered with other tools; doing so
may warrant modification of the proposed pedagogical and
assessment framework.
4.3 Future Work
While many of the previously cited analytics curricula were
designed with industry participation, we intend to complement
that work by vetting this design from the perspective of the
student's learning process. We will empirically study the learning
process, course engagement, and enthusiasm about the course and
about analytics in general from the students' perspective.
Understanding students' sentiment about the course will also help
drive design improvements. We will leverage the relevant results
from these explorations to enhance the design, content, and
assessment. Another direction for future development is to
further practice inclusiveness by examining relevant case studies
and data. As the scale of this
course continues to grow, we are investigating the use of AI-
enabled tools to assist in the grading and feedback process.
Finally, we will develop an advanced version of this course for
Honors College students.
Journal of Information Systems Education, 34(4), 360-369, Fall 2023
367
5. CONCLUSION
We have successfully designed and delivered an introductory
analytics course for all incoming School of Business
undergraduates (more than 4,000 per academic year at this
point). The course foundation, in terms of content and course
assessments, met all design goals and curriculum guidelines for
our undergraduate programs. Integrating industry-standard
analytics tools into the curriculum enriches the learning
experience and stimulates students' curiosity.
The novelty of this work is that we developed a hybrid
course on introductory analytics that we offer at scale, targeted
to all business undergraduates. Our course model has proven easy
for onboarding faculty to embrace, allowing them to teach the
course with relatively minimal preparation. The integration of
accessible analytics tools such as Excel, JMP Pro, and Tableau
also makes the course approachable for students from diverse
backgrounds. In addition, faculty coordination enables seasoned
faculty to share their valuable experience as the course evolves.
Based on informal student feedback, our design makes learning
analytics self-fulfilling and serves its purpose well. Our account
can help other faculty in similar situations.
Our course design and implementation have been battle-tested
through the pandemic and by thousands of students; the course
fulfills our objective of maximizing learning across a diverse
student body, handles scale comfortably, and can be coordinated
with little friction. Those looking to move in similar directions
would be well served by considering the caveats and challenges
that arose for us along the way.
6. ACKNOWLEDGEMENTS
We deeply appreciate the thoughtful comments and suggestions
from the editors and reviewers. We thank the leadership of the
W. P. Carey School of Business and IS Department for valuable
discussions and feedback throughout curriculum design and
implementation. The authors also acknowledge the input from
course learners, participating faculty, school administration,
technical support team, and teaching assistants during the
course development and delivery across more than 100
sections. Finally, we acknowledge support from the W. P.
Carey School of Business Dean’s Office “Research on
Teaching and Learning” Grant Program.
7. REFERENCES
AACSB. (2020). 2020 Guiding Principles and Standards.
AACSB.
Agarwal, R. B., Yong Goh, K., Ghose, A., Shmueli, G.,
Slaughter, S., & Tambe, P. (2014). Does Growing Demand
for Data Science Create New Opportunities for Information
Systems? The Thirty-Fifth International Conference on
Information Systems (ICIS) (pp. 1-7). Auckland.
Allen, I. E., & Seaman, J. (2016). Online Report Card:
Tracking Online Education in the United States. Babson
Survey Research Group and Quahog Research Group, LLC.
Bloom, B. S. (1956). Taxonomy of Educational Objectives: The
Classification of Educational Goals. Handbook 1:
Cognitive Domain. Longman.
Burch, G. F., Giambatista, R., Batchelor, J. H., Burch, J. J.,
Hoover, J. D., & Heller, N. A. (2019). A Meta-Analysis of
the Relationship between Experiential Learning and
Learning Outcomes. Decision Sciences Journal of
Innovative Education, 17(3), 239-273.
Burns, T., & Sherman, C. (2019). A Cross Collegiate Analysis
of the Curricula of Business Analytics Minor Programs.
Information Systems Education Journal, 17(4), 82-90.
Chapman, P., Clinton, J., Kerber, R., Khabaza, T., Reinartz, T.,
Shearer, C., & Wirth, R. (2000). CRISP-DM 1.0: Step-by-
Step Data Mining Guide. CRISP-DM Consortium.
Colquitt, J. A., LePine, J. A., & Noe, R. A. (2000). Toward an
Integrative Theory of Training Motivation: A Meta-
Analytic Path Analysis of 20 Years of Research. Journal of
Applied Psychology, 85(5), 679-707.
Deng, X. & Sun, R. (2022). Barriers to e-Learning During
Crisis: A Capital Theory. Journal of Information Systems
Education, 33(1), 75-86.
Dennis, A. R., Fuller, R. M., & Valacich, J. S. (2008). Media,
Tasks, and Communication Processes: A Theory of Media
Synchronicity. MIS Quarterly, 32(3), 575-600.
Dinter, B., Kollwitz, C., & Fritzsche, A. (2017). Teaching Data
Driven Innovation – Facing a Challenge for Higher
Education. The Twenty-third Americas Conference on
Information Systems (AMCIS) (pp. 1-10). Boston.
Distance Education. (n.d.). In Wikipedia.
https://en.wikipedia.org/wiki/Distance_education
Doshi, R., & Krishan, N. (2020). Winning the War for Talent:
An Enterprise Guide to Building a Sustainable Workforce
Strategy. Everest Group.
Firth, D., King, J., Koch, H., Looney, C. A., & Pavlou, P.
(2011). Addressing the Credibility Crisis in IS.
Communications of the Association for Information
Systems, 28(13).
Frost, R., Matta, V., & Kenyo, L. (2021). A System to Automate
Scaffolding and Formative Assessment While Preventing
Plagiarism: Enhancing Learning in IS and Analytics
Courses That Use Excel. Journal of Information Systems
Education, 32(4), 228-243.
Guppy, N., Verpoorten, D., Boud, D., Lin, L., Tai, J., &
Bartolic, S. (2022). The Post-COVID-19 Future of Digital
Learning in Higher Education: Views From Educators,
Students, and Other Professionals in Six Countries. British
Journal of Educational Technology, 53(6), 1750-1765.
Gupta, B., Goul, M., & Dinter, B. (2015). Business Intelligence
and Big Data in Higher Education: Status of a Multi-Year
Model Curriculum Development Effort for Business
School Undergraduates, MS Graduates, and MBAs.
Communications of the Association for Information
Systems, 36(23).
Hmelo-Silver, C. E., & Barrows, H. S. (2008). Facilitating
Collaborative Knowledge Building. Cognition and
Instruction, 26, 48-94.
Jaggia, S., Kelly, A., Lertwachara, K., & Chen, L. (2020).
Applying the CRISP-DM Framework for Teaching
Business Analytics. Decision Sciences Journal of
Innovative Education, 18(4), 612-634.
Jankowski, N. A. (2020). Assessment during a Crisis:
Responding to a Global Pandemic. The National Institute
for Learning Outcomes Assessment (NILOA).
https://www.learningoutcomesassessment.org/wp-
content/uploads/2020/08/2020-COVID-Survey
Krathwohl, D. R. (2002). A Revision of Bloom's Taxonomy:
An Overview. Theory Into Practice, 41(4), 212-218.
Mann, L., Chang, R., Chandrasekaran, S., Coddington, A.,
Daniel, S., Cook, E., Crossin, E., Cosson, B., Turner, J.,
Mazzurco, A., & Dohaney, J. (2021). From Problem-Based
Learning to Practice-Based Education: A Framework for
Shaping Future Engineers. European Journal of
Engineering Education, 46(1), 27-47.
Manyika, J., Chui, M., Brown, B., Bughin, J., Dobbs, R.,
Roxburgh, C., & Byers, A. H. (2011). Big Data: The Next
Frontier for Innovation, Competition, and Productivity.
McKinsey Global Institute.
Marjanovic, O. (2012). Using the Revised Bloom’s Taxonomy
to Scaffold Student Learning in Business
Intelligence/Business Analytics. European Conference on
Information Systems Proceedings. Barcelona, Spain.
Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K.
(2009). Evaluation of Evidence-Based Practices in Online
Learning: A Meta-Analysis and Review of Online Learning
Studies. U.S. Department of Education.
Muller, N. M., & Seufert, T. (2018). Effects of Self-Regulation
Prompts in Hypermedia Learning on Learning Performance
and Self-Efficacy. Learning and Instruction, 58, 1-11.
Noetel, M., Griffith, S., Delaney, O., Sanders, T., Parker, P., del
Pozo Cruz, B., & Lonsdale, C. (2021). Video Improves
Learning in Higher Education: A Systematic Review.
Review of Educational Research, 91(2), 204-236.
Paul, J. A., & MacDonald, L. (2020). Analytics Curriculum for
Undergraduate and Graduate Students. Decision Sciences
Journal of Innovative Education, 18(1), 22-58.
Power, J., Conway, P., Ó Gallchóir, C., Young, A.-M., &
Hayes, M. (2022). Illusions of Online Readiness: The
Counter-Intuitive Impact of Rapid Immersion in Digital
Learning Due to COVID-19. Irish Educational Studies.
Prince, M. (2004). Does Active Learning Work? A Review of
the Research. Journal of Engineering Education, 93(3),
223-231.
Rodammer, F., Speier-Pero, C., & Haan, J. (2015). The
Integration of Business Analytics into a Business College
Undergraduate Curriculum. The Twenty-First Americas
Conference on Information Systems (AMCIS) (pp. 1-9).
Puerto Rico.
Scaringella, L., Górska, A., Calderond, D., & Benitez, J. (2022).
Should We Teach in Hybrid Mode or Fully Online? A
Theory and Empirical Investigation on the Service-Profit
Chain in MBAs. Information & Management, 59(1).
Schiller, S., Goul, M., Iyer, L. S., Sharda, R., & Schrader, D.
(2015). Build Your Dream (Not Just Big) Analytics
Program. Communications of the Association for
Information Systems, 37(40).
Topi, H., Valacich, J. S., Wright, R. T., Kaiser, K. M.,
Nunamaker, J. F., Sipior, J. C., & de Vreede, G. J. (2010).
Curriculum Guidelines for Undergraduate Degree
Programs in Information Systems. ACM & AIS.
Wang, S., Griffiths, R., Christensen, C., D’Angelo, C., &
Condon, K. (2022). An Evaluation of a First-of-Its-Kind
Hybrid Law Degree Program. Journal of Computing in
Higher Education, 34, 517-544.
Wilder, C. R., & Ozgur, C. O. (2015). Business Analytics
Curriculum for Undergraduate Majors. INFORMS
Transactions on Education, 15(2), 180-187.
Williams, B., & Elmore, R. (2021). Teaching Business
Analytics during the COVID-19 Pandemic: A Tale of Two
Courses. Communications of the Association for
Information Systems, 48(1).
Wixom, B., Ariyachandra, T., Douglas, D., Goul, M., Gupta,
B., Iyer, L., Kulkarni, U., Mooney, J. G., Phillips-Wren, G.,
& Turetken, O. (2014). The Current State of Business
Intelligence in Academia: The Arrival of Big Data.
Communications of the Association for Information
Systems, 34, 1-13.
Zadeh, H. A., Schiller, S., Duffy, K., & Williams, J. (2018). Big
Data and the Commoditization of Analytics: Engaging
First-Year Business Students with Analytics. e-Journal of
Business Education & Scholarship of Teaching, 12(1), 120-
137.
Zhang, L., Chen, F., & Wei, W. (2020). Teaching Tip: A
Foundation Course in Business Analytics: Design and
Implementation at Two Universities. Journal of
Information Systems Education, 31(4), 244-259.
AUTHOR BIOGRAPHIES
David P. Darcy received his Ph.D. in information systems from
the University of Pittsburgh's Joseph M. Katz Graduate School
of Business. He is a clinical associate professor in the
Department of Operations and Decision Technologies, Kelley
School of Business, Indiana University (IU). Before joining IU,
he was on the faculty at University College Dublin's business
school, the University of Maryland's Robert H. Smith School of
Business, the Irish Management Institute, Florida International
University's College of Business, and Arizona State University's
W. P. Carey School of Business. His
research in IT project management has been published in IEEE
Transactions on Software Engineering, Statistical Science, and
IEEE Software, among others. His current research interests are
in designing, developing, and validating engaging analytics
content and courses.
Asish Satpathy (Ph.D., MBA) is senior faculty in the
Department of Information Systems at the W. P. Carey School of
Business, Arizona State University (ASU). He has spent years in
graduate and undergraduate teaching, academic research, and
innovative curriculum development in data mining and location
analytics. His research has been published in high-impact
scholarly journals such as Annals of GIS, Managerial Auditing
Journal, Nuclear Instruments and Methods in Physics Research A,
Physical Review D, Physical Review Letters, and Physics Letters
B, among others. In addition, he has won several teaching awards,
has founded and consulted with several tech startups, and has
presented workshops and boot camps on location analytics at
various business schools. He is currently chairing the committee
that oversees the data analytics certificate programs offered by
the Department of Information Systems at ASU.
APPENDIX
Introductory Analytics Course Schedule
Learning Objective | Module | Quiz | Deliverables
Performing Analysis & Finding Insights | [Intro] Problem Solving & Actionable Analytics | | Excel refresher
 | [Science] Science of Analytics | Science of Analytics | Advanced Excel
 | [Visualization] Data Visualization & Interpretation | Data Visualization & Interpretation | Visualization
 | [Descriptive] Descriptive Statistics | Descriptive Statistics | Descriptive statistics
 | [Inferential] Inferential Statistics | Inferential Statistics | T-tests & ANOVA
 | [Regression] Supervised Data Mining | Supervised Data Mining | Linear regression
 | [Logistic] Logistic Regression | Logistic Regression | Logistic regression; Case 1 Preview
 | [Clustering] Unsupervised Data Mining | Unsupervised Data Mining | Clustering; Case 2 Preview
Generating, Organizing & Storing Data | [Transformation] Data Transformation | Data Transformation | Practical Exam Review; Case 3 Preview
 | [Practical] Practical Exam | | In-class exam
 | [Architecture] Data & Information Architecture | Data & Information Architecture | Case Review
Data Collection Strategy | [Organizations] Experimental Design | Experimental Design | Case 1
 | [Biases] Biases & Ethics | Biases & Ethics | Case 2
Analytics in Business | [AI&ML] AI & Machine Learning | Machine learning | Case 3
 | [Review] Course Review | | Final exam review
 | [Final] Final Exam | |
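As an illustration of the kind of deliverable listed in the schedule above (the [Regression] module's linear regression), the following sketch fits an ordinary least-squares line in plain Python. The data and variable names are hypothetical; in the course itself, students produce such deliverables with tools like Excel, JMP Pro, or Tableau rather than code:

```python
# Sketch: simple one-variable least-squares regression of the kind covered
# in the [Regression] module. Ad-spend vs. sales pairs are hypothetical.
def fit_line(xs, ys):
    """Return (slope, intercept) minimizing the sum of squared errors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

ad_spend = [1.0, 2.0, 3.0, 4.0, 5.0]   # hypothetical predictor
sales = [2.1, 3.9, 6.2, 8.0, 9.8]      # hypothetical response
m, b = fit_line(ad_spend, sales)
print(f"sales ~= {m:.2f} * ad_spend + {b:.2f}")  # -> sales ~= 1.95 * ad_spend + 0.15
```

The closed-form slope here (covariance over variance of the predictor) is the same quantity spreadsheet tools report as the coefficient in a simple regression.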