In their survey of literature on the adoption of learning technologies, Liu, Geertshuis, and Grainger observe that “institutions attempt to promote learning technologies without regard for the recognised and evidenced practices in innovation adoption and change management” (4.3). The research below — and the preceding guide, which derives from it — attempts to address this gap by drawing heavily on scholarship from the discipline of Organization Development (OD), including research into change management and into navigating organizational power structures to advocate for and generate support. OD principles are supported and supplemented by scholarship from Cognitive Science (CS), Learning Science (LS), and Learning Technologies (LT). Support drawn specifically from these fields includes coverage of the pedagogical drivers of choosing a technology and information on the application of trauma-informed pedagogy to technology implementation.


Sancho-Gil and colleagues (2018) evaluated a variety of educational technology initiatives across the past several decades, ranging in scope from government-led efforts to school-level endeavors. They concluded that the vast majority of these initiatives failed both in achieving long-term, consistent usage and in meeting their intended learning goals. This consistent lack of long-term success in the implementation of educational technology was systemic across organizations and organizational levels. Given these observations, it is unlikely that the failures are due to the shortcomings of any particular technological tool in its own right, but rather to a larger structural flaw in how educational technology is implemented.

While the excellent work of Sancho-Gil and associates focuses primarily on their belief that the underlying cause of failure in ed tech implementation is an oversimplified understanding of technology in a commercially driven culture — a stance I do not inherently dispute — this guide aligns primarily with Johnson and colleagues (2016), Liu, Geertshuis, and Grainger (2020), Cilsalar Sagnak and Banak (2021), and many others who have illuminated the crucial importance of cultural and organizational factors in technology implementation efforts. I propose that addressing these factors will have a marked impact on the effectiveness of technology implementation in higher education settings.

Strasser and colleagues (1981) define organizational effectiveness as “the degree to which organizations are attaining all the purposes they are supposed to” (p. 70). I would argue that instructional effectiveness can be similarly defined: the degree to which a course is attaining all the purposes for which it is undertaken. These purposes may vary from explicit course outcomes to the fostering of personal growth or disciplinary passion, but all the tools upon which an instructor draws in creating and teaching their course should measurably further those purposes.

Research indicates, however, that this is frequently not the case, in part because of “barriers related to teachers’ beliefs about instructional technology, preferred teaching methodologies, and willingness to make changes to classroom practices” (Ertmer, 2008, p. 247). Equally significant are “how organisational structure and culture shape innovation adoption” (Liu et al., 2020, p. 6) and the reality that someone’s “ability to cope with change is a function of both micro (individual) and macro (contextual) factors” (Harden et al., 2021).

Liu and colleagues (2020) surface key characteristics of instructional technology adoption, describing a system of influences that will drive the success or failure of the endeavor, including facilitative leadership, attitudes toward change, prioritization of research, and support provided for academic development. This system aligns extremely well with principles of Organization Development, including theories of planning change (Cummings & Worley, 2015); theories of managing change (Harden et al., 2021); organization process interventions (Cummings & Worley, 2015); and even the necessity for continuous change and its implications for learning organizations adopting technology (Hanelt et al., 2020). This guide proposes that an intentional and consistent application of Organization Development principles can empower instructors to more effectively introduce digital learning tools into their teaching practices.

Given “that current educational practices need to prepare students to thrive in an ever changing technological society” (ChanLin, 2007, p. 45), there is a need for a guide to technological change management and implementation that provides instructors with practical strategies to overcome known challenges and barriers to using learning technologies.

Planning: Research

It is largely unnecessary to make a case for the importance of planning. Instructors are well accustomed to mapping out their course syllabi and curating activities and content, and it stands to reason that the addition of a new technology would only increase the necessity for preparation. Given that there is an entire burgeoning professional field focused largely on facilitating the use of various technologies for learning — instructional design — an instructor implementing a digital tool without that support will need to spend significant time considering how best to proceed.

Your Institution: Research

Liu and colleagues (2020) observe “how organisational structure and culture shape innovation adoption” (3.5). Their research points to the fact that the organization — for these purposes the academic department or institution as a whole — can positively and negatively impact the success of implementing educational technology by adjusting their levels of communication, clarity, and support, as well as by actively striving to create a positive and change-forward culture. As a result, they “recommend that efforts to support staff adoption should recognise the emotional and attitudinal consequences of introducing new technologies, be informed by best practices in learning design and change management, recognise diversity in ways technologies can be used by academics, and gather and share hard evidence to guide individual and collective decision making” (4.4).

In part, the relative importance of the institution to the success of the endeavor is discouraging, given the minimal influence many individual instructors have over the decisions and norms that will so directly shape their individual success or failure. Given the context of this research, however, there are two valuable reasons to convey information about institutional-level approaches to improving technology implementation. The first is to inform instructors who may be involved in the process of technology selection, so that they can help to influence and improve the process at a higher level. Such involvement would itself indicate that the institution is already employing some best practices, given the persistent importance in the literature of engaging those impacted in the decision-making process (Lewin, 1947; de Koster, Volman, & Kuiper, 2017; Hussain et al., 2018; Tsoukas & Chia, 2002; Liu et al., 2020). The second is to empower instructors to navigate without — and to advocate for — institutional support.

Participating in Technology Selection: Research

As Hussain and colleagues (2018) note, involving employees is the “oldest and [most] effective strategy in formulating the planning and implementing [of] change” (p. 124). There are any number of models for planned change, from Lewin (1947) to Tsoukas and Chia (2002), and they all recommend a component of employee involvement in the process. In the case of higher education, the role of the employee corresponds to the individual instructor, while management aligns neatly with the department or school (what this guide has termed the institution). It is clear, then, that the institution should involve instructors in the process of selecting and implementing educational technology as much as is feasible.

Cummings and Worley (2015) note four significant elements of involvement (p. 376). One of those, rewards, touches on the extrinsic and intrinsic motivations of employees and how those motivations might be leveraged to reinforce the positive aspects of employee involvement. While there are things the administration could do to reward employees for their participation in the technology selection and implementation process, advocating for rewards from the position of an instructor does not feel like a best practice, and therefore this guide does not address that element.

The other three key elements are power, information, and knowledge and skills. Power acknowledges that employees must be given enough authority to influence the relevant decisions (even if only by sharing their opinions); otherwise their involvement is not real and therefore not beneficial from a change management perspective. Information stipulates that employees be given timely access to any information necessary to make informed decisions. Finally, knowledge and skills underscores the need to provide the training and development employees require to make informed decisions. This differs from information in that it is an applied understanding. For instance, an instructor may be aware of the technical specifications of a particular learning tool, but they may not understand the implications of those specifications if they have not received the necessary training.

Tian and Zhai (2019) support Cummings and Worley, concluding that “[e]mployee involvement in decision-making that provides an opportunity for knowledge interactions and collective learning helps an organization grow stronger, whereas employee involvement in decision-making that fails to provide such an opportunity leads to poor organizational performance.” In order for employee involvement to be valuable, it must also be effectively supported, which means that there is still a vital role for managers — the institution — to play in these decisions. Tian and Zhai (2019) also found that the best outcomes resulted from decision-making collaborations between managers and employees. Extrapolating from this established organization development research, instructors can make a clear case for their involvement in selecting educational technology.

As participants in that process, instructors should “recognise diversity in ways technologies can be used by academics” (Liu et al., 2020, 4.4) — and potentially share that understanding with the administration and any other decision-makers. Liu and colleagues (2020) observed that different typologies of adopters (3.4.1) with differing pedagogical beliefs and practices (3.4.4) will inevitably have different experiences with the same digital tool. This is to be expected and is in actuality a strength, allowing different users to adapt the tool to their particular needs and approach. Instructors participating in technology adoption should be wary of assuming that all instructors will use a tool the same way, a caution that scales with the relative complexity of the tool in question. It is also worth noting that Liu and colleagues’ observations extend to students as well: their differing experience levels and interests will inevitably influence their use of the tool. Instructors should consider multiple use cases or personas (see “User Personas”) when exploring potential educational technology.


Instructors who are participating in the decision-making process should be aware that their participation comes with organizational obligations as well. Participants are now part of the change process and are subject to the literature and best practices associated with being on the implementation end of change (as opposed to the receiving end). Liu and colleagues (2020) note “[t]he influential role of champions and middle managers” in the ultimate success of technology adoption (4.3). Cummings and Worley (2015) additionally observe the importance of “internal commitment,” which is to say that if initiatives “are to result in meaningful changes, management, staff, and other relevant members must be committed to carrying them out” (p. 163). By agreeing to be a part of the decision-making team, individual instructors are tacitly or explicitly accepting a responsibility to foster the success of the initiative, which necessitates supporting the change even when it might not be their first preference, except in cases where it is morally — rather than preferentially — questionable.

Being Assigned Technology: Research

From information to timelines to explicit support structures, there are a number of things that the administration should provide if they are requiring instructors to use a particular tool. It may be necessary, however, for instructors to prompt the administration in these matters because, as Liu and colleagues (2020) observed, “Higher education institutions have expertise in leadership, management, change, disciplinary requirements, academic development, learning and teaching. All these perspectives could inform the adoption but they do not seem to have been utilised” in the interventions described across the 131 studies that made up their research (4.3).

A good place to begin is by seeking information. Harden and associates (2021) remind us that “[i]nherent in the definition of coping with change is a reluctance on the part of employees to enter situations where the inputs, processes or solutions may be vague or undefined” (p. 145). Therefore, “if institutional leaders have evidence that a technology saves time [or offers some other measurable benefit] and demonstrate this relative advantage, they should speak directly to the voiced concerns of many academics” (Liu et al., 2020, 4.3). If they do not, instructors should not be shy about asking for whatever data is available.

The International Society for Technology in Education (2022) lists “Learner” as the first standard of excellence for educators, noting that educators need to be continual learners with regard to technology and the associated scholarship. This learning and exploration can take the form of formal training or it may be as casual as spending a few designated hours exploring the tool. Once instructors have a reasonable understanding of the tool itself, they should also consider the larger context in which it will be used. McClaren and associates (2022) establish the importance of instructional context in the efficacy and experience of using learning technologies.

As a final note, it is worth acknowledging explicitly the challenges associated with using educational technology that an instructor had no part in selecting. Feelings of uncertainty and stress are valid and can often be compounded by limited support on the part of the institution. With this acknowledgement, however, comes the understanding that it is still important for instructors to keep an open mind as they begin the process of engaging with the digital tool. Tiffan (2010) speaks to the importance of mindset in navigating change, particularly in positions of leadership. While the instructor may not be in a decision-making role institutionally, they are still the leader of their individual course and carry the associated obligations to help their students to navigate the changes. Approaching new technology with an open mind provides the best chance for success — for instructors and for students.

Ask for Support: Research

Grant (2018) reminds us of the sometimes uncomfortable importance of explicitly asking for necessary resources in the workplace, noting that it is a vital component of professional success. Grant also notes that the simple act of asking for and receiving those resources — when done appropriately — can actually improve inter-group relationships. Most important for this context, however, is that instructors who ask for the resources and support that they need will be more likely to succeed in implementing their new technology.

Institutions should keep in mind that “extra resources are needed for such change activities as training, consultation, data collection and feedback, and special meetings. Extra resources are also helpful to provide a buffer as performance may drop during the transition period” (Cummings & Worley, 2015, p. 200).

In addition to the support instructors might expect to receive from their institutions, it is also important to consider the benefits of peer-to-peer engagement. The organization development concept of Team-Member Exchange is immediately relevant here. This concept relies on instructors “receiving support, encouragement and assistance from other team members, thus reducing stress and increasing ability to cope with change. There is evidence to suggest that those individuals with higher levels of social support are more inclined to respond to change positively” (Harden et al., 2021, p. 147).

Finally, instructors should consider community or governmental resources to support their efforts, particularly in addressing issues of student access. Johnson and colleagues (2016) recommend that instructors “obtain funds for resources via non-traditional sources (e.g., crowdfunding, grants)” (p. 22). Many local governments and community organizations have programming already in place that may support such instructor efforts. Federal support from the Office of Educational Technology may also be a good fit: https://tech.ed.gov/funding/

Point to the Literature: Research

“Organizational changes result in situations where employees are uncertain about their ability to adjust, which creates stress for employees” (Harden et al., 2021, p. 145). Much of this stress can be avoided by drawing on best practices from the literature. Research such as Harden and colleagues’ article — as well as the plethora of scholarship throughout this appendix of supporting research — can help guide instructors in developing successful digital practices and can form the basis for strong, compelling arguments for additional support.

Pick Your Battles: Research

While self-advocacy is important for instructors — as is advocacy on behalf of students — Badaracco observes that effective leaders “don’t waste political capital on fights they can’t win; they save it for occasions when they really want to fight” (2001). This is equally true for anyone navigating the workplace, including instructors. No matter how well-run or well-intentioned, every institution will have numerous issues still needing to be addressed. Instructors have enough to manage without engaging on every potential flaw with a digital tool or how it is rolled out. Instead, they should preserve their energy and resources to advocate for the most important issues, or the areas in which they are most likely to effect change. “Conflict is healthful only when people’s energies are pointed in the right direction and when carried out in a productive way” (Joni and Beyer, 2009).

Your Course: Research

Even as new educational technologies are explored and implemented, it is important for instructors to continue to draw on their own expertise and to avoid subordinating that expertise to the desire to ‘make something work’ with a digital tool. As Rebecca Chiyoko Itow (2020) observes,

Pedagogical tools, management strategies, and curricular designs certainly help teachers transform conventional curricula for online learning spaces. However, using them does not require learning to teach all over again. In fact, it must not mean learning to teach all over again. Teachers can and must rely on past experiences and professional expertise to reconceptualize what learning might “look like” in an online class context. (para. 2)

This is usefully borne out in frameworks such as the SAMR model. Dr. Ruben Puentedura’s SAMR model (2010) was initially created to guide digital pedagogies in the K–12 classroom, but it is a valuable tool for considering technology integration in higher education as well. It proposes four key stages: substitution, augmentation, modification, and redefinition. Each stage incorporates a different approach to transforming a particular assignment from print/in-person to digital. While the latter stages of modification and redefinition may be considered more advanced approaches, there is room in every course for each stage and approach to integration.

Similarly, the Technological Pedagogical Content Knowledge framework (TPACK), seminally elucidated by Mishra and Koehler (2006), encourages instructors to explore the intersections of their existing content and pedagogical knowledge with their burgeoning technological abilities.

Each of these frameworks would be a valuable starting place for instructors incorporating educational technology into their teaching for the first time, as they encourage instructors to recognize and build on their existing knowledge and skills, even as they reimagine their course materials for a digital context.

Inventory Your Course: Research

As in the “Your Course: Research” section, both the SAMR model (Puentedura, 2010) and the TPACK framework (Mishra & Koehler, 2006) are useful in evaluating and adapting an existing course to integrate educational technology.

In addition, instructors should consider limiting the initial magnitude of change (Cummings & Worley, 2015, p. 30) and selecting only a single activity type or recurring course element with which to employ the tool. By incorporating the technology systematically over time, changes are more likely to be manageable for the instructor and accepted by the students. While there will be new students each term, for whom the tool may be completely new, the instructor will have already established best practices and will have past models and experiences to draw on as they continue to deepen their usage.

Design for Instructor Presence: Research

When applying organization development principles to an educational context, the instructor is taking on many of the functions of the manager/supervisor. Pierce and colleagues (2002) observe the importance of the manager’s role in facilitating change, noting that successful change depends on their communication, support, incentivization, and active participation in the change. At the same time, Morgan and Zeffane (2003) illustrate the importance of transparency and trust in management’s dealings with employees. These same characteristics are an important part of the instructor’s role in facilitating change for students and can be usefully mapped onto the existing scholarship around designing for instructor presence in educational literature.

Much has been written on the value of instructor and social presence since Garrison, Anderson, and Archer (2000) introduced the Community of Inquiry framework, with these as two of its foundational elements. Garrett Dickers, Whiteside, and Lewis (2012) propose a useful model for building social presence in online learning that is inclusive of both the student and instructor facets. Their approach considers key elements of presence, three of which are highlighted below: Community Cohesion, Interaction Intensity, and Instructor Involvement. As you consider ways to integrate presence, each of these offers useful guidance.

First, Community Cohesion is defined by the authors as “the extent to which participants feel like a community” (p. 23). As you build, consider whether you are designing an activity to support individual presence or collective presence. Both are important for a well-balanced course, and both can be fostered through the structure of the overall course and through the discrete opportunities for engagement that you build into it. Second, Interaction Intensity reminds us that not every moment of instructor (or student) presence needs to carry the same intensity; varying the intensity level creates a more natural, sustainable set of interactions. Finally, Instructor Involvement is exactly what it sounds like, but it is worth underscoring the value of building means of connection and communication into the use of digital tools. Doing so counters the tendency to view the technology as ‘faceless’ and helps students to feel that instructor support is active rather than passive.

If we consider earlier, lecture-based modes of teaching, instructor presence is in many ways the original animating force in a classroom. In spite of much necessary movement away from the lecture-centered approach, Lowenthal and Dunlap (2018) found that “students are more interested in connecting with their instructor than their peers” (p. 281). There may be a variety of reasons for this, including a desire to better understand the person assigning their grades, but with this in mind, instructors should consider putting a particular focus on instructor presence in the very early parts of the course, so that students are able to feel an immediate connection that they can draw on as the course continues.

A final recommendation regarding instructor presence draws on a best practice of “making a road map for change, citing specific activities or events that must occur if the transition is to be successful” (Cummings & Worley, 2015, p. 196). If instructors create an explicit plan for building their presence into the course, it will reduce the effort required later in the term and will ensure that they feel as ‘present’ to students toward the end of the course as they do in the beginning.

Design for Student Presence: Research

The sociocultural theories of learning developed by Vygotsky (1978) recognize that “culturally developed sign systems such as language, technology tools, and artifacts…are seen as tools for thinking and the construction of socially shared meanings” (Gunawardena, Frechette, & Layne, 2019, p. 28). This is to say that educational technology can provide each student with a space to bring their language, their learning tools and practices, and the artifacts of their individualized perspectives into the course to develop shared meaning in a space that is collaboratively and diversely co-created.

It’s important to note that there are two significant aspects to student presence: students feeling seen by (or present to) the instructor and students feeling seen by and connected to fellow students. When considering ways to incorporate this element into your course, Lowenthal and Dunlap (2018) remind us “that different students have different social presence needs and that students respond differently to different social presence techniques” (p. 283). With this in mind, you may want to consider the advice in “User Personas” and how instructor efforts at inviting student presence may be received by various subsets of the class.

Where the tool is being integrated on a larger scale, instructors should consider ways to encourage student presence by “decentralizing implementation” and empowering students to explore the functionality in their own right (“local self-design”) and to propose activities or create and share learning modules (Cummings & Worley, 2015, p. 197). This will allow students to benefit from “peer-to-peer knowledge sharing” as outlined by Zhuge (2002, p. 28). This approach is likely to accelerate the rate at which students become comfortable with the tool, and it ties in nicely with educational practices of collaborative learning, allowing “a social, participatory approach to interaction that reflects participants’ diverse backgrounds, talents, and learning preferences” (Gunawardena et al., 2019, p. 8).

This collaborative learning approach can be expanded to include a number of useful principles across both organization development and education. Much as it was relevant in “Ask for Support: Research,” the notion of peer-to-peer support, or Team-Member Exchange, comes into play with students as well. This concept creates avenues for students to “receiv[e] support, encouragement and assistance from other team members, thus reducing stress and increasing ability to cope with change. There is evidence to suggest that those individuals with higher levels of social support are more inclined to respond to change positively” (Harden et al., 2021, p. 147). This notion of Team-Member Exchange, when encouraged to flourish, can eventually develop into a fully fledged online learning community (which, if much of your coursework is done in person, would exist both as a component of — and in parallel to — the learning community that is your class as a whole).

Garrison and associates (2000) offer a useful closing reminder on the topic of student presence within a digital space:

“We do not believe that the effect of media per se is the most salient factor in determining the degree of social presence that participants develop and share through the mediated discourse. Rather, the communication context created through familiarity, skills, motivation, organizational commitment, activities, and length of time in using the media directly influence the social presence that develops.” (pp. 94–95)

Disaster-Proofing: Research

Dennen (2020, March 16) provides a simple and useful summation of how to approach teaching during times of emergency: “People first. Content second. Technology third.” This does not mean that instructors should avoid technology; rather, it acknowledges that in an emergency, best practices fall by the wayside, and dealing with tech issues simply cannot take time that needs to be spent on the basic reality of trying to learn while the world feels out of control. Hodges and colleagues (2020) usefully outline crucial differences between typical online or digital instruction and the emergency remote teaching so many were forced to rapidly adopt in the early days of the COVID-19 pandemic.

Disaster-proofing a course acknowledges these differences and leverages the benefits of hindsight to design a course that optimizes flexibility and minimizes course-related stress. These strategies, while inspired by early pandemic responses, are not limited in relevance to global catastrophes. Barrios (2020) observes the need to account for seasonal power outages when implementing learning technologies in his course, while the growing awareness of the technology gap and the rise of trauma-informed pedagogies point to scholarly acknowledgement of the blockers and struggles that students face on a daily basis.

Preparing in advance allows instructors to take Dennen’s advice to heart, deprioritizing the simple mechanics of the course and focusing on getting themselves and their students through the situation.

For additional suggestions for disaster-proofing with learning technologies, see “Appendix: Adopting Technology and Emergency Remote Teaching.”

Introducing: Research

In organization development literature, many key tasks fall to the managers involved in the change, who are called upon to support their employees through the process: preparing them for the change, providing resources for it, listening to and addressing concerns about it, and finally evaluating its effectiveness in order to plan for the future. In a higher education context, the instructor steps into this managerial role in order to guide their students through the implementation of the new technology. The sections that follow explore strategies for fulfilling this change management role.

Explain Your Choice: Research

Harden and colleagues (2021) address the uncertainty and stress that are caused by change — and which can arise in students if they are trying to adapt to a new technology at the same time they are trying to learn new course material. To reiterate a quote from Harden and colleagues (2021) that appears elsewhere in this guide, “[i]nherent in the definition of coping with change is a reluctance on the part of employees to enter situations where the inputs, processes or solutions may be vague or undefined” (p. 145). This uncertainty can be mitigated by providing explicit information on how the technology functions and why it is being implemented in the course.

Acknowledge the Challenge: Research

Cummings and Worley (2015) note the importance of empathy and support in the change management process, observing that this requires “a willingness to suspend judgment and to see the situation from another’s perspective” (p. 183). By being up front about the challenges associated with the change, instructors create a situation in which students “feel that those…who are responsible for managing change are genuinely interested in their feelings and perceptions,” which leads to a reduction in resistance and “helps establish the basis for the kind of joint problem solving needed to overcome barriers to change” (p. 183). This sort of active listening will also help to provide useful feedback and data that will be valuable in the validation and reflection stages.

Cummings and Worley (2015) note the usefulness of rewards as a motivating element to reinforce the positive aspects of the change being implemented (p. 376). In the context of an instructor’s course, that may mean providing written praise that fosters intrinsic motivation, or it may mean awarding a small number of extra points for utilizing certain features of the tool. Providing rewards for positive experiences will also help to de-emphasize any negative experiences and reinforce habits of persistence in students.

Leverage Existing Resources: Research

Organization development has its origins in manufacturing, which means that efficiency has been a core principle from the beginning. Processes such as Six Sigma, developed by engineer Bill Smith and pioneered at Motorola, and the Lean methodology, which grew out of the Toyota Production System, have since been applied to everything from software development to toy production. There may be useful principles to borrow from those processes, but it is their core value that matters most in an educational context: efficiency.

If resources supporting the adoption and use of a new digital tool exist and are available to instructors, then they should be used. There is simply no benefit to duplicating the effort that went into creating these resources, just as there is no benefit in starting from scratch with a tool when there is a colleague who has used it previously. Instructors should embrace the principle of efficiency and make use of whatever support is available to them.

You Are Not Tech Support: Research

While the context calls for this to be a separate heading in the guide because it is such a common pitfall of introducing learning technology to a course, the concepts underpinning it are identical to “Leverage Existing Resources: Research.” Instructors have limited bandwidth and as much as possible should err on the side of efficiency in sending students to the product’s tech support (or the school’s IT department) rather than trying to solve every minor issue for students. Keeping an eye on the situation in order to step in as needed requires significantly less energy than attempting to provide tech support for, while teaching with, an instructional tool.

Be Positive: Research

It can be tempting, especially when things are frustrating, for instructors to share their negative experiences with students — and indeed, it is better not to be disingenuous when things are going poorly. The U.S. Bureau of Labor Statistics reported, however, “that negativity costs businesses $3 billion a year due to its harmful effects” (Flanagan, 2019). Oettl et al. (2018) note that “Change Management (CM) requires thorough planning and an execution attitude that consists of positivity and endurance” (p. 173). Flanagan (2019) agrees, observing that embracing a positive mindset does not mean one has to ignore or eliminate problems, but instead “to turn justified complaints into positive solutions.” As much as possible, instructors should embrace a positive mindset and be solutions-focused regarding the inevitable problems that arise.

Utilizing: Research

Cummings and Worley (2015) usefully observe that “[t]he initial excitement and activity of changing often dissipate in the face of practical problems of trying to learn new ways of operating” (pp. 197, 199). This is a valuable reminder to instructors that there is more to implementing a new technology than introducing it.

Using the Technology Consistently: Research

Liu and colleagues (2020) “recommend that future research [on the topic of educational technology adoption] recognises that adoption is multidimensional, occurs over time and has anticipated and unanticipated consequences” (4.1). As a result of those unanticipated consequences, there is no way to know everything about the tool in an instructor’s particular context until it has actually been implemented. This quote underscores the importance of using a digital tool more than a single time over the course of a term and using it responsively — which is to say adapting the usage in light of lessons learned along the way. Cummings and Worley (2015) note that “perfectly good change projects often are abandoned when questions are raised about short-term performance” and that often those implementing the change “do not keep focused on a change because they want to implement the next big idea that comes along” (p. 201). It is important that instructors implement the technology sufficiently (both to an appropriate depth and for an appropriate amount of time) in order to properly evaluate its effectiveness.

Checking In: Research

While it is valuable for instructors to provide information to students regarding the technology and its implementation, Simoes and Esposito (2012) observe that “communication strategies whose role or function is to refine and align change can be more successful” (p. 4). That is, communication needs to flow both from instructor to students and from students to instructor. This is facilitated by scheduling planned check-ins with students on a relevant cadence.

Kolb (1984), among many others, established the importance of reflection to the process of learning. Check-ins provide a valuable opportunity for instructors to blend useful feedback on the technology with productive reflection for each individual learner.

Making Adjustments: Research

It is worth revisiting a point raised by Liu and colleagues (2020) that was also applicable to “Using the Technology Consistently: Research”: the need to recognize “that adoption is multidimensional, occurs over time and has anticipated and unanticipated consequences” (4.1). Because of those unanticipated consequences, it is important for instructors to be prepared to make adjustments to their usage of educational technology. Cummings and Worley (2015) agree, reminding us of the importance of reflecting on the changes that were implemented and adjusting or redesigning to account for the new information (p. 197).

Hitting Your Stride: Research

If, after a new technology has been adopted, that technology “changes again too quickly or…before it is fully implemented, the desired results may never materialize” (Cummings & Worley, 2015, p. 201). As a result, instructors should avoid introducing additional new tools until it is clear that students have adjusted and until appropriate data has been collected on the effectiveness of the original tool.

As adjustments are being made, it is important to continue to motivate students through the change. Fisher (1995) observes the beneficial impact of rewards on employee behavior, noting the ways that desired behavior — which in this case would be engagement with the technology and the associated content — can be successfully fostered through the thoughtful implementation of a rewards system.  For an instructor who is interested in fostering intrinsic as well as extrinsic motivations among learners, this may take a variety of forms, from points to praise.

Another way to motivate students is for instructors to share any immediately available data on successes associated with usage of the tool. When students, like professionals, are provided with “timely and supportive feedback on new behaviors, their ability to learn more quickly increases” (Cummings & Worley, 2015, p. 197).

Validating: Research

Evaluation of the effectiveness of a change is always a key part of implementing changes following organization development principles, and scholars studying learning agree: “Building an evaluation into any educational technology project can have strong implications for the long-term success of the intervention” (Light, 2008, p. 7).

Cummings and Worley (2015) note the importance of selecting appropriate criteria (p. 211) against which to validate the technology. Ideally, instructors will have identified those criteria in advance; in large part, they will be the criteria used to select the technology in the first place, even if the instructor was not the one who selected it.

For Accessibility: Research

A number of resources are available to inform instructors and guide decisions around accessibility. The W3C Web Accessibility Initiative (WAI), which publishes the Web Content Accessibility Guidelines (WCAG), provides a page on Accessibility Principles that can be a good starting point: https://www.w3.org/WAI/fundamentals/accessibility-principles/

In addition, the Online Learning Consortium provides valuable pages, including Accessibility Evaluation Tools and Other Good Resources. Visit https://sites.google.com/sloanconsortium.org/accessibilitytoolkit/home?authuser=0 to learn more.

For Security: Research

It is difficult to provide succinct general guidance on data privacy protections to instructors because needs and concerns will vary so widely depending on a variety of factors, including the complexity of the technology involved, the ways in which it is implemented, the privacy policies of the institution, and any state and federal guidelines.

Instructors who have selected their own technology should be sure to evaluate their usage against all of the prevailing security standards that apply to them. A good place to start in understanding digital privacy law is by reading the high-level overview and frequently asked questions about the federal Family Educational Rights and Privacy Act (FERPA), which governs student education records and so is relevant to most situations involving student-created content or student-related data. Instructors can learn more at https://www2.ed.gov/policy/gen/guid/fpco/ferpa/index.html

Against Outcomes: Research

It is important for instructors to remember that what they “are looking for is not a theoretical understanding of educational technologies or a set of generalized principles about what technology can do, but a contextual understanding of the particular conditions of the[ir] implementation, and the contextual factors that interacted with the intervention, that lead to a specific outcome” (Light, 2008, p. 7). This means that one of the key criteria for judging the effectiveness of a digital tool is the degree to which it supports an instructor’s particular course outcomes. There may be other useful measures — time saved, student engagement — but a tool that provides a measurable increase in student progress toward learning outcomes is a primary goal of nearly all educational technology adoptions.

For Student Support: Research

When considering educational technology and its impact on student achievement, it is useful to remember that “[t]he type of data collected influences its potential impact for decision making” (Irving, 2006, p. 15). The sections that follow offer some suggestions to help instructors to consider the data that will best facilitate their decisions about the effectiveness of the tool in supporting students.

Instructor Presence: Research

The importance of incorporating instructor presence into the usage of educational technology is outlined in “Design for Instructor Presence.” Unfortunately, “[a]lthough teaching presence is important, there is not a consensus on its measurement” (Wang et al., 2021, “Teaching Presence”). Research indicates a number of useful avenues for consideration. Wang and colleagues highlight five key elements (drawn from Community of Inquiry scholarship): “design and organization, discourse facilitation, direct instruction, assessment, and technological support” (“Preliminary Development of the Teaching Presence Measurement Framework”).

While technological support is a relevant factor both in instructor presence and in the overall effectiveness of the digital tool, this guide departs from Wang and colleagues by recommending that, as much as possible, outside resources be leveraged to provide that support.

Student Presence: Research

The importance of incorporating student presence into the usage of educational technology is outlined in “Design for Student Presence.” In terms of how to evaluate its effectiveness, Irving (2006) observes that from a student presence perspective, instructors should be looking for “tools [that] provide a window into a learner’s prior and present understandings and feedback loops that support teacher’s instructional decision making and monitoring” (p. 17). In addition, instructors should look for the success of their efforts to build in authenticity (Martin, 2007, p. 81) and community (Gunawardena, 2019) for learners. The relevant measures will vary for individual instructors.

Increased Engagement: Research

Educational technology has “been linked to an increase in behavioural, affective and cognitive student engagement, the facilitation of which is a central concern of educators” (Bond et al., 2020). However, Kirkwood (2009) reminds us that using technology does not guarantee improved student engagement, which leaves it to instructors and institutions to validate the impact on their student population.

Bond and colleagues (2020) note in their extensive review of the literature on student engagement that 93% of the studies do not offer a definition of student engagement. This leaves instructors to provide a definition that aligns with their individual pedagogy, but one that might be guided and measured by the five general characteristics that Bond and colleagues (2020) identified as the most frequent across the works under review: participation/interaction/involvement, achievement, positive interaction with teachers and peers, enjoyment, and learning from peers. There is obvious crossover between these measures of engagement and other measures this guide proposes for student support, so instructors would do well to establish their own criteria at the beginning of the implementation and tease out the practical nuances during their reflection process, after as much data as possible has been gathered (see “Reflecting”).

Student Retention: Research

O’Gorman, Salmon, and Murphy (2015) identify a number of key factors in student retention. Some, such as offering a physical safe space, are beyond the scope of what technology can provide. There are others for which technology is better suited, such as providing avenues to enable flexible support, empower students in their identities, create a sense of community, and create psychologically safe and supportive spaces (O’Gorman, Salmon, & Murphy, 2015).

If student retention is a relevant part of the technology goals for instructors or institutions, initial research should be done on the current state of the relevant metrics before the technology is implemented and clear, relevant measures should be established to gauge the success of any technological interventions.

For Instructional Support: Research

Edmentum (2018) quite succinctly expresses another crucial element of educational technology: that “tools should support teachers and inform classroom instruction.” The sections that follow offer some suggestions to help instructors to consider the data that will best facilitate their decisions about the effectiveness of the tool in supporting their labor as educators.

Time Saved: Research

This is perhaps the most difficult metric to measure during and after the first term of technology implementation because, as Cummings and Worley (2015) observe, there is often “a decline in performance, productivity, or satisfaction as change is implemented” (p. 201). Instructors will recognize this to be true when they consider the quantity of labor involved in launching a new digital tool, even if it is ultimately intended to reduce instructor effort in the long term. To the degree possible, instructors should discriminate between time spent on implementation and time spent (or saved) on recurring course-related tasks when endeavoring to validate the tool on the basis of time saved.
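The distinction above can be made concrete with a simple time log. The sketch below is illustrative only: the task names and hour values are hypothetical, and the point is merely to keep one-time implementation effort separate from recurring course-related effort when judging time saved.

```python
# Hypothetical time log distinguishing one-time implementation effort
# from recurring, course-related tasks. All task names and hour values
# are illustrative, not drawn from any cited source.
time_log = [
    ("build assignment templates", "implementation", 6.0),
    ("learn the gradebook sync",   "implementation", 3.0),
    ("weekly grading",             "recurring",      2.5),
    ("weekly announcements",       "recurring",      0.5),
]

def hours_by_kind(log):
    """Total the logged hours separately for each kind of effort."""
    totals = {}
    for task, kind, hours in log:
        totals[kind] = totals.get(kind, 0.0) + hours
    return totals

print(hours_by_kind(time_log))
```

Only the recurring totals should be compared against the pre-technology baseline; the implementation hours are a one-time cost that will not repeat in later terms.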

Pedagogy Furthered: Research

Edmentum (2018) reminds us that “[d]igital learning is subject to the same pedagogical best practices as traditional classroom instruction” and that decisions about educational technology should be evaluated against “sound foundations of teaching.” Instructors should consider their own pedagogical goals and practices and evaluate the degree to which the digital tools facilitate or hinder them.

Insights Provided: Research

There is at times a fine line between an educational practitioner and an educational researcher and the use of educational technology tends to further blur those lines. In discussing the affordances of digital tools, Kandara and Kennedy (2020) observe that “[o]n a scale not seen in the past, the educational researcher can peer into ongoing educational processes and collect, analyze and present results on a range of cognitive, affective and behavioral data” (p. 1). In leveraging the same digital tools, the educational practitioner — the instructor — can do the same. In fact, they are explicitly called upon to do so by the International Society for Technology in Education (2022), setting the following standard for educators: “Educators understand and use data to drive their instruction and support students in achieving their learning goals.”

This guide recommends starting slowly. Those looking for a more research-based approach to their use of digital tools should consider Kandara and Kennedy (2020) for further reading. The authors provide an excellent roadmap in their chapter “Educational Data Mining: A Guide for Educational Researchers,” which appears as part of Kennedy and Qian’s Advancing Educational Research With Emerging Technology.

For the majority of instructors, the focus will be on a specific subset of the data: “behavioral, activity, and performance data” (Kandara & Kennedy, 2020, p. 9). That is, the information the tool can provide about what students are doing, how they are doing it, and the degree to which they are succeeding in their efforts. The affordances of each digital tool will vary widely, as will the degree to which that data is a significant factor in the adoption of the technology.

Against Alternatives: Research

It is probable, given the wide range of educational technology available, that more than one tool might meet the needs of an instructor or institution. In validating a tool against alternatives, Reid (2020) suggests the use of a decision-making matrix in which the affordances of the various tools are mapped out against the requirements of the instructor or institution.

It may be useful to consider “no technology” or “LMS only” as one of the comparatives in the matrix to provide the most complete picture of the available options.
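A decision-making matrix of this kind can be sketched in a few lines of code. Everything in the example below — the tools, the criteria, the weights, and the 1–5 scores — is hypothetical and chosen purely for illustration; it is not drawn from Reid (2020), which supplies only the general concept of weighing affordances against requirements.

```python
# Hypothetical weighted decision matrix. Criteria, weights (importance),
# tools, and 1-5 scores are all illustrative assumptions.
criteria_weights = {
    "supports course outcomes": 5,
    "accessibility": 4,
    "cost": 3,
    "ease of use": 3,
}

# Each option is scored 1-5 per criterion; an "LMS only" baseline is
# included as one of the comparatives, as suggested above.
scores = {
    "Tool A":   {"supports course outcomes": 4, "accessibility": 3, "cost": 2, "ease of use": 4},
    "Tool B":   {"supports course outcomes": 3, "accessibility": 5, "cost": 3, "ease of use": 3},
    "LMS only": {"supports course outcomes": 2, "accessibility": 4, "cost": 5, "ease of use": 5},
}

def weighted_total(option_scores, weights):
    """Sum of (score x weight) across all criteria for one option."""
    return sum(option_scores[criterion] * weight
               for criterion, weight in weights.items())

for option, option_scores in scores.items():
    print(option, weighted_total(option_scores, criteria_weights))
```

The totals are only as meaningful as the weights behind them, so the real value of the exercise is that it forces the instructor or institution to state requirements and their relative importance explicitly before comparing tools.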

Reflecting: Research

Mantere and Wiedner (2021) point to the importance of establishing periods of reflection during the change process to allow meaningful conclusions to be drawn (p. 806). This dovetails with the scholarship of Kolb (1984) and countless others in the education space who advocate for the value of reflection as a means of assessing progress. These scholars consider reflection to be a key stage of learning — and using a new digital tool is nothing if not a learning process.

Gathering Information: Research

This guide aligns with the research of Wanger (1999), Kelle (2008), and others in recommending that instructors draw on both quantitative and qualitative data as they reflect on their experiences implementing educational technology. Ideally, much of this data will have been collected throughout the term, so now is the moment for instructors to organize their materials and capture any remaining impressions. When all of the information is considered in the aggregate, trends and conclusions will begin to emerge.

Building on a Successful Implementation: Research

It is often simple to point to the moment when a change began — the search for a digital tool started, for instance, or a new technology was adopted and rolled out. It is usually much more difficult and less productive to identify a specific ending. This is because change is not a simple moment in time. Instead, as Hernes, Hussenot, and Pulk (2021) remind us, “change needs to be understood…as an ongoing process of becoming” (p. 732). This points to the idea in organization development of continuous change, a model that departs from Lewin’s (1947) traditional approach to change and that is perhaps better suited to the continuously advancing nature of technology. Continuous change was introduced to the field by Tsoukas and Chia (2002) and acknowledges the ongoing nature of change initiatives. It is useful for instructors to consider their efforts in this context and work toward the associated goal of continuous improvement.

Incremental Improvement: Research

There is a software development approach called Agile, which includes a concept that is applicable here: the notion of incremental implementation. The software company Atlassian (2022) explains it this way:

Instead of betting everything on a “big bang” launch, an agile team delivers work in small, but consumable, increments. Requirements, plans, and results are evaluated continuously so teams have a natural mechanism for responding to change quickly. (“What Is Agile?”)

Much like software development, this is a valuable approach to software implementation. The moment after a successful implementation is not the time for instructors to completely re-think their usage of the tool, but to consider ways to make steady, incremental improvements focused on the areas that will have the largest impact.

User Personas: Research

User personas are employed in a wide array of business contexts, including marketing and Agile software development. They are fictitious representations of groups of people who share particular characteristics. In marketing, these would be segments of the customer base for a product. In software development, they are likely to be segments of the users or intended users of the software. Across all fields, they are intended to provide a different or more nuanced perspective on the needs of the people they represent.

As instructors are considering their technology, creating some basic user personas may provide insights into subsets of their student populations who may experience the technology differently than the majority. Personas may also help instructors to better shape their future use of the tool through a more nuanced understanding of how it is actually used.

The sort of streamlined user personas that may be of use to instructors should be composed of two key elements: student attitudes and student behaviors. According to Ferreira, Silva, Barbosa, and Conte (2018), “[a]ttitudes are important for creating empathy with the user, while behaviours are important for designers and developers to understand how the product will be used” (p. 280). Explicitly considering both attitudes and student behaviors — in isolation and in relation to the technology — can provide actionable insights for future improvements.
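The two-element persona described above can be modeled as a small data structure. The sketch below is a minimal illustration: the persona name, attitudes, and behaviors are invented examples, and the structure simply pairs the two elements Ferreira and colleagues (2018) highlight.

```python
from dataclasses import dataclass, field

# Streamlined student persona built around the two elements highlighted
# by Ferreira, Silva, Barbosa, and Conte (2018): attitudes and behaviors.
# All persona details below are hypothetical.
@dataclass
class StudentPersona:
    name: str                                      # fictitious label
    attitudes: list = field(default_factory=list)  # feelings toward the tool
    behaviors: list = field(default_factory=list)  # how the tool is actually used

casey = StudentPersona(
    name="Commuter Casey",
    attitudes=["anxious about unreliable wifi", "values flexibility"],
    behaviors=["submits work from a phone", "logs in late at night"],
)

# Reviewing attitudes and behaviors side by side can surface needs
# (e.g., mobile-friendly or asynchronous features) that majority
# usage patterns would otherwise hide.
print(casey.name, casey.attitudes, casey.behaviors)
```

Even two or three such personas, sketched on paper rather than in code, can reframe adjustments around students whose experience of the tool diverges from the majority.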

Course Mapping: Research

This guide recommends a backward design approach to course mapping, as laid out by Wiggins and McTighe (1999), which uses the course outcomes as a starting point. With the outcomes in mind, an ideal approach would be to consider what measures could be used to prove student mastery of those particular outcomes or to prove progress toward mastery. These are the assessments, determined in part by what will be acceptable to the institution as proof of learning, in part by what the instructor feels will best serve students, and in part by the affordances of the technology.

From there, research, ethics, experience, and experimentation lead the instructor to develop the particular activities (“learning events”) necessary to move the dial on student progress (Wiggins & McTighe, 1999, p. 23). These activities in combination with the assessments are the building blocks of a course.

Many instructors already use course mapping when plotting out a syllabus, but often the integration of technology into a course involves only minor adjustments or additions to an already extant whole. Leveraging course mapping is a way for instructors to start from scratch with the technology as a core consideration in the course’s foundational structure. This approach is recommended for instructors who have had successful implementations and have already made some incremental improvements (see “Incremental Improvement”). Course mapping will help instructors to fully incorporate the technology into every aspect of their teaching.

Advocating to Drop an Existing Tool: Research

Fundamentally, advocating to drop usage of a particular technology involves making a business case: drawing on the available evidence to argue for a particular outcome for the benefit of the institution. In preparing to make their case, instructors may want to consider the principles of the business case outlined by McGlaughlin (2004) to help frame their efforts.

Additionally, instructors will want to provide a clear accounting of the success of the tool against its intended benefits. Presenting a concise, data-driven, and well-organized business case for dropping the technology may not prompt an immediate change, but it can be the starting point for important discussions. Providing all of the necessary information in a single document allows it to be circulated to important stakeholders and serves as a meaningful record of the instructor’s concerns.

James Cook University (2020) offers additional advice for creating a business case that is readily available online: https://online.jcu.edu.au/blog/how-to-write-business-case

License

The Change Management Guide to Incorporating Educational Technology Copyright © by Sherry Mooney. All Rights Reserved.
