Definition of Competency or Understanding of Competency
According to Kaplowitz, there is a hierarchical relationship between learning theory, instructional theory, and ID models, where theories about how people learn inform pedagogy, and instructional (pedagogical) theory in turn influences how instructional designers develop models (2014, p 19). This hierarchical relationship is mirrored in Benjes-Small & Miller, where the definition of learning theory (“a conceptual framework describing how learning happens”) is considered foundational, whereas instructional theory (“how instructors should teach based on what we know about what people learn”) is positioned in relation to the learner (2017, p 65). The former is learner-centered, while the latter is ostensibly teacher-centered (Becker, 2015).
Pre-eminent learning theories include cognitivism, which focuses on the internal processes that take place during learning; constructivism, which emphasizes emotional and social aspects of the unique individual; and social constructivism, which sees learning as a “process of developing psychological tools” through social interaction (Benjes-Small & Miller, 2017, pp 68-71; Holliday, 2016, pp 115-116). Learning theories differ in how they:
- Understand how learning occurs
- Assess the factors that facilitate learning
- Determine how learning changes behavior and affects performance
- Describe the learner’s role
- Develop assessments and/or measure success (Pulichino, 2019).
It seems more difficult to identify prominent instructional theories. The one I could definitively identify as an instructional theory, Sweller’s cognitive load theory (which comes from a cognitivist perspective), considers learners’ “working memory and its limits” when designing instruction (Benjes-Small & Miller, 2017, pp 69-70).
Instructional design (ID) is a process to “design, develop, and assess learning experiences” (Benjes-Small & Miller, 2017, p 62), but it also aims to make instruction more effective by applying research from fields like education and psychology (Kaplowitz, 2014, pp 14-15). An instructional design model is a “prescriptive set of guidelines that help instructors structure the process of their teaching and student learning” (Benjes-Small & Miller, 2017, p 65). Many ID models owe a debt to an unattributed series of interdependent ID steps called ADDIE (Analysis, Design, Development, Implementation, Evaluation), which evolved among US military personnel in the 1970s, only to be picked up by instructional designers in the 1980s (Kaplowitz, 2014, p 20). Prominent ID models from the latter half of the 20th century include:
- Gagne’s Nine Events of Instruction (1965), a theory that developed into an early ID model
- Dick & Carey’s model (1978), one of the first systematic approaches to ID
- Wiggins & McTighe’s Understanding by Design (1998), which utilizes a concept called “backwards design” and influenced the ACRL Framework for Information Literacy for Higher Education
- Heinich & Molenda’s ASSURE model (1999), similar to the Dick & Carey model with an emphasis on technology and media
- Booth’s USER model (2010), designed by an ILI professional for ILI professionals, and
- Kaplowitz’s Teaching Tripod Approach (2014), which ties ID directly to IL instruction (Benjes-Small & Miller, 2017, p 67; Kaplowitz, 2014, pp 17-24; Faryadi, 2007, p 3)
Other contemporary ID models include Merrill’s First Principles of Instruction, the SAM model, Action Mapping, the Learning Circle Framework, the Morrison, Ross, & Kemp model, and Kirkpatrick’s Four Levels of Training Evaluation (Instructional Design Models, 2019). Aside from incorporating learner-centered teaching (LCT) (Kaplowitz, 2014, pp 6-7), ID models target specific learners and generally involve:
- Analyzing content/skills to be taught
- Planning instruction
- Preparing materials, tools, delivery mechanisms, etcetera, and
- Designing instruction (Kovacs, n.d.)
Instructional design usually begins with analyzing learner needs (and identifying gaps in their knowledge and abilities) and understanding the learners themselves (Holliday, 2016, pp 112-117; Kaplowitz, 2014, p 32), followed by establishing expected learning outcomes (ELOs) (Kaplowitz, 2014, pp 32, 62). Learning outcomes ask what a “learner should be able to do after instruction,” are student-centered, and are used for assessing students; learning objectives are similar to instructional goals in that they have primarily the teacher’s interests in mind; and learning objects refer to the “package of content, practice, and assessment built around a single learning objective” (Benjes-Small & Miller, 2017, p 65). Ideally, ELOs address audience, behavior, condition, and degree (the ABCD formula), use active verbs, and address affective outcomes (Kaplowitz, 2014, pp 65-72). Learning activities are designed and developed based on the learning outcomes (Holliday, 2016, p 120; Kaplowitz, 2014, p 83), and the process begins with the setting of an instructional goal or goals (Kaplowitz, 2014, p 60).
Before learning activities are completed, instructional designers determine how students and instruction will be assessed, both formatively (during instruction; low stakes) and summatively (after instruction; high stakes); assessment is not simply measurement, but an “essential part of the learning process” (Holliday, 2016, p 118). A core value in instructional design is the idea of alignment: that instruction “should align with the learning outcomes and assessments that provide evidence of learning and the context of learning” (Holliday, 2016, p 120). This ID process, which flows between learning outcomes, learning activities, and assessments, is summed up well by Kaplowitz’s Teaching Tripod (2014, p 32).
Information organizations have been moving in the direction of instructional design for almost three decades: information literacy (IL) has become the “dominant education paradigm” in libraries, and information literacy instruction (ILI) has eclipsed other library instructional models (Holliday, 2016, p 102). Emerging out of the 1990s and having as its goal the teaching of “specific skills related to finding, evaluating, and using information,” ILI was developed by educational and informational organizations in response to advances in technology and an economy restructuring around knowledge workers (Holliday, 2016, p 103). These technological advances have favored a more “interactive and participatory learner-centered approach” to teaching (Kaplowitz, 2014, p 19), which some have seen as a direct challenge to traditional academic learning (Gorman, 2012, p 117).
In 2016, after almost two decades of modification, the ACRL developed and released its Framework for Information Literacy for Higher Education, an ILI model that drew upon the pedagogical work of Wiggins and McTighe as well as threshold concepts, knowledge practices, the idea of dispositions, and the concept of meta-literacy (Framework, 2016, p 2). Kaplowitz, noting that ID (and ILI following in its wake) had become fully enmeshed within the modern technological environment, makes a persuasive argument that ID processes provide librarians with a powerful tool with which they can make “thoughtful and appropriate decisions” about whether or not to wield particular technologies (2014, p 6), and thus become less reactive to technology’s seeming power. Holliday discusses at length how to make “wise choices about instructional technology” (2016, p 124) and concludes her chapter on ID and ILI by suggesting that library instruction will be strengthened by “thinking programmatically” (2016, pp 125-129), meaning that instructional design within a library environment can accomplish more, and do it better, when everyone is on board.
For me, literacy and learning are entwined in one of the immutable pillars I see as upholding the values of the information profession. The ancient “encyclopedic dream” (Lynch, 2016, p 382) of amassing all of the world’s knowledge is now a daily reality for librarians, who almost literally have the world at their fingertips. Information and knowledge are not the same thing, however, and information overload has become a modern malaise. Bibliographic instruction has been partially swept away by information literacy instruction within libraries, and instructional design provides a rudder with which librarian-instructors can steer toward solid new theories and practices.
Preparation to Understand Competency K: Coursework and Work Experience
Information Communities (INFO 200) and Reference and Information Services (INFO 210) both introduced me to the history of bibliographic instruction within reference services. INFO 200 also showed me the value of face-to-face (F2F) engagement and collaborative partnerships. Information Professions (INFO 204) prepared me to understand how librarians construct broad strategic goals, and how those goals can affect the way librarians design, develop, and evaluate their programs.
For my internship (INFO 294) in XX library’s Eastern Region, I gained professional experience running my own teen program and assisting librarians with instruction during their programs. I had the opportunity to discuss how, and to what degree, librarians perform needs assessments, develop learning outcomes, plan activities and curricula, and assess and evaluate their work. I observed these librarians working both individually and in collaborative environments. I collected over a dozen different examples of how librarians organized their programs; the majority did not apply instructional design systematically, but all were aware of the instructional design elements they did use. I witnessed examples of collaborative activity among learners and of partnerships between individuals and organizations. Finally, I met one librarian who helped me understand what STEM, STEAM, and STREAM curricula look like, and who gave me a printout showing how to choose quality, developmentally appropriate STEM resources. I was able to assist her on three occasions with STEM-related instruction.
The first three evidentiary items I chose each reflect a learning activity I performed, each involving several steps, that led to a draft of a final instructional design plan. The fourth evidentiary item is the Final Instructional Design Plan itself. I chose to include all four stages in the creation of the ID Plan because they show the effort involved in constructing such a document, and because presenting them in stages makes the iterative nature of the process visible.
Evidence
Learning Activity 3: Analyze
Discussion of Evidentiary Items
https://documentcloud.adobe.com/link/review?uri=urn:aaid:scds:US:a3428a1b-4951-4a82-9cda-165873ab96f5
For my instructional design plan, I chose to reply to Tabitha Jay, Research & Collections librarian at XY College, who had requested assistance teaching her students how to choose and refine research topics. I agreed to collaborate with another student, and after reviewing instruction examples in Learning Activity 2, we set to work. My partner and I interviewed Ms. Jay and her colleagues for almost an hour via Zoom, asking numerous questions about the students (their needs, abilities, and issues), the university, the librarians, the teachers, and what they might be looking for in an instructional design plan.
For Learning Activity 3, my partner and I worked through four steps:
1a. Through an examination of learner characteristics and extant materials, and by determining who should be providing instruction, my partner and I developed a Needs Assessment Statement.
1b. By asking and answering questions regarding instructional goals (including their purpose, desired outcomes, how consensus might be achieved, and their measurability), and by referring to our Needs Assessment Statement, my partner and I crafted a concise Instructional Goals Statement.
2a. After analyzing the XY web page and Research Guide, I decided that an interactive LibGuide within their web pages would provide a location for our instruction, which we decided would take the form of online tutorials.
2b. My partner and I described our instructional unit as a general concept, and then broke that concept down into specific concepts and specific activities.
3. For Step 3, my partner and I reflected on the learners’ characteristics and entry behaviors, based on our interview with the XY librarians.
4a. In Step 4, taking all of the previous steps into consideration, my partner and I wrote four Learning Outcomes describing what the students should be able to do after instruction.
4b. In the second part of Step 4, we looked to our Learning Outcomes and described how they would motivate learners. Here I helped keep my partner (who had the overall vision for how instruction should look and function) on track, editing our description into pithy statements.
This project shows that I understand how to begin an instructional design plan in a collaborative way. I am capable of assembling a needs assessment based on the unit of instruction and know what information is relevant to producing such an assessment. Furthermore, I show that I can aggregate that data and use it to illustrate what learners’ characteristics and entry behaviors look like. I demonstrate that I can determine instructional goals based on a needs assessment, and then use that information to construct a cogent instructional goals statement.
This paper provides evidence that I have the ability to describe what an instructional unit will look like conceptually and in context, and then break that description down into specific concepts and activities. I substantiate that I have the thinking, writing, and editing skills necessary to use the foregoing data as the building blocks for constructing clear, explicit learning outcomes, with assessment and evaluation factored in. Finally, I show that I understand how to succinctly describe how these ELOs might motivate learners.
Learning Activity 4: Design
Discussion of Evidentiary Items
https://documentcloud.adobe.com/link/review?uri=urn:aaid:scds:US:2f7661ed-00a0-433f-af85-2fcc906970f1
In Learning Activity 4, I planned instructional strategies with my partner, based on our instructional analysis from LA 3. By this stage, my partner had a fully realized vision for what our tutorials would look like, so this part of the process had me cross-examining my partner’s ideas and then attempting to corral those ideas into a readable document. This activity required us to describe a) how instruction would be practically presented and delivered, b) how learners would participate and in what order, c) how learners would be tested and evaluated, and d) how we would measure whether learning took place. As a writer I understand the importance of weaving backstory into a plot, and in writing and editing descriptions of our instruction, I found it helpful to fall back on the entry behaviors and characteristics of our learners. For instance, because student needs varied, I insisted that we sequence the tutorials from easiest to most challenging. The second activity, the topic development worksheet, was at first not described in much detail; my contribution here was to keep pushing my partner to supply more description until the outlines of the instructional unit came into focus. It was also at this point that I convinced my partner to drop the fourth instructional unit (and its learning outcome), an online video tutorial; after a great deal of research testing a dozen or more tutorials, I had concluded that none were available that fit our ELO and could also provide proper assessments.
In the learner participation section, I made sure that we articulated a design that took into consideration not only the learners’ skills and abilities but also their limitations. While my partner was very good at making sure we chronicled every way in which learners would participate, I ensured that, wherever possible, we mentioned the ways in which learners were being motivated to participate and why. I also made sure we included any instances where theory informed our design. For instance, I mention that instructional units are completed in “discrete chunks,” which is a nod to cognitive load theory.
During the construction of the testing and assessment section, I came across the idea of assessment levels in Kaplowitz and was able to incorporate the concept of behavioral assessments as a baseline way of measuring instruction efficacy (2014, pp 117-123); this proved to be a useful way for my partner and me to come to agreement on specifics. In the final section, where we were to determine whether student learning would transfer into future work scenarios, I was able to refer to Kaplowitz’s assessment levels again and show that verification of long-term learning results was not our central concern. Defining the scope of our assessment and evaluation not only helped frame our instructional design, it also allowed my partner and me to communicate better, which in turn allowed us to think through the instructional scenarios with greater definition.
This evidentiary item demonstrates that I have the ability to collaborate with others in producing an instructional design plan. I show myself capable here of planning how instruction will be presented, how learners will participate in the instruction, and how they will be assessed and evaluated. Within the writing and editing here there are passages that demonstrate my ability to present a plan in a nuanced way: for instance, not simply describing that our instruction was designed around students’ capabilities, but articulating that we also planned our instruction with students’ limitations in mind. Similarly, my linking learner participation to motivation, or instructional design to theory, indicates that I am capable of connecting theory and research to practical instruction and am able to articulate that connection. I show that I can research and analyze instructional tools and assess whether they can help fulfill ELOs or provide proper assessments. Finally, I demonstrate that I understand how the particulars of an established instructional design model, like the Teaching Tripod approach, can inform my own design. For instance, my application of Kaplowitz’s assessment levels to our ID plan provided us with a useful structure that helped us craft a better document.
Learning Activity 5: Develop (Instructional Design)
Discussion of Evidentiary Items
https://documentcloud.adobe.com/link/review?uri=urn:aaid:scds:US:83180b08-ed64-41c5-a830-682ef4bcf983
My analysis in LA 3 produced a Needs Assessment Statement, an Instructional Goals Statement, and Learning Outcomes, and that work provided the data and structure for designing instruction in LA 4. My work in LA 5 further developed that instructional design by reviewing and then choosing instructional materials, communication tools, and technical interfaces, and by clarifying how formative and summative evaluations would be delivered.
We began LA 5 with a review of a particular instructional tool. We chose to review the OERC (Open Educational Resources Commons) because it allowed us to describe its “Guide-on-the-Side” technology, which enables students to work through a tutorial while simultaneously, in a split-screen view, interacting with a live database. My participation here involved pressing my partner to evaluate her tool of choice honestly and not avoid addressing some of its limitations. For instance, from testing the OERC I found that it did not actually email students their test scores, as we had originally hoped, and thus some of our statements about how we would assess performance had to be called into question. At this stage, my research analysis pushed my partner to do more research of her own, which resulted in the discovery of Springshare’s Side-by-Side technology, which could eventually be configured to produce the assessments that we hoped to provide.
Similarly, in the review of classroom communication tools, one of my important contributions was to cross-examine my partner about how much communication the various tutorials would require, both for instruction and for assessment. We had been claiming that the online tutorial, the worksheet, and the interactive “guide-on-the-side” tutorials were self-contained units in which instruction and assessment required no teacher or librarian intervention. But when we pressed on how these instructional units would work in practice, it became apparent that some teacher or librarian intervention might be needed in certain situations: to gather enough assessment data, for example, text-based feedback cards and online polling would be necessary, both of which require librarian or teacher communication tools. In our review of our Learning Management System options, I suggested that we might be able to embed the LibGuide directly into the LMS, perhaps within the “Help” button, and even link to it with metadata tags, all of which my partner researched and confirmed as possible.
In Step 6, where we finally chose teaching and learning tools and instructional materials, I pressed my partner to make a decision regarding which OER tutorial we would use. I helped us reach a compromise: we would suggest using the free OER tutorial initially but include instructions for how XY's IT department might, in the future, develop a Springshare version that would provide proper assessments. I was thus able to express how our instructional tool might be used both practically and ideally within our design plan. In Step 7, where we detailed how our instructional tools would be assessed and evaluated, I performed the task of bringing all our disparate assessment writings together in one place, and then challenged my partner to address the remaining gaps in linking our assessments to our ELOs. My partner was able to mentally juggle linking the assessments to the ELOs, but I was the one who kept our results honest and accurate in words on the page. I ensured that important details, like the mention of assessment tools serving as a cognitive pause, stayed in the final draft. My insistence on our assessments providing fine granularity served us well in our final paper, when these four pages were reduced to two very information-dense pages.
This evidentiary item shows that I collaborate well with others. LA 5 includes passages that describe the research and reviewing skills I possess, such as being able to run through online tutorials and analyze their suitability as instructional tools. I illustrate an understanding of communication tools, for instance how an interactive online tutorial might communicate results to a learner. I am able to articulate how an instructional tool might be used both practically and ideally within the same instructional design plan. I demonstrate an ability to describe in detail how instruction will be assessed and evaluated and to ensure that assessment remains strictly tied to expected learning outcomes.
Final Instructional Design Plan with Screencast-O-Matic Introduction
Discussion of Evidentiary Items
https://documentcloud.adobe.com/link/review?uri=urn:aaid:scds:US:d6a64edd-a8a9-40eb-8541-2db81deb0d4c
Just before writing our proposal, my partner and I agreed that we would use Kaplowitz’s Teaching Tripod ID model as the definitive framework for presenting our final instructional design plan. I agreed to this model because of its streamlined and flexible modern design, a feature that proved helpful when we began synthesizing 34 pages of analysis, design, and development into a rough draft. That work was definitely a collaborative affair, but I did the bulk of the work in transforming the rough draft into a five-minute screencast introduction, as well as a six-page paper capable of being presented orally in a 15-minute timeframe. I accomplished this by removing the scaffolding from our rough draft. I had to make educated guesses regarding what an appropriate length would be for a presentation to our client, what essential information needed to be conveyed, and what information could be cut without diluting our work. Once I settled upon an appropriate length, I had to determine the maximum word count for the plan, and then how many words could be allotted proportionally to each section, weighing each section in terms of importance.
I sifted through our document innumerable times, line-editing the sections with the aim of reducing verbiage, increasing clarity, and keeping only what was essential. For some sections, like the Needs Assessment, this meant tightening the language and using bullet points. For the analysis of entry behaviors and learner characteristics, it meant boiling 1.5 pages of analysis down to four sentences. The ELOs were the one area where I needed to collaborate with my partner, which involved asking for, and receiving, final additions and deletions to these statements. I edited our analysis of learner motivation into a lean quarter-page section, but the parts describing instructional strategies and assessment needed enough room to express how these elements would work in practice. For the former section I mostly pared down the language to be as succinct as possible. My work on the assessment section, however, was complicated by two factors: a) last-minute changes regarding exactly how assessment and evaluative data would be gathered and measured, especially with the pen-and-paper worksheet, and b) the need to describe how assessments would work with Springshare’s (more ideal) Side-by-Side technology rather than the (more practical) OER Guide-on-the-Side interface. By having a writing plan, and by understanding the relative importance of the instructional design elements, I was able to give these important ID components enough room to be described without overwhelming the stakeholders.
I brought these thinking, designing, writing, and editing skills to bear in the final section, where I summarized how and why our entire instructional design plan was created, described its contents and our design decisions, and provided requirements as well as suggestions for how the plan should be implemented. After the document was finished, I spent the final week of classwork figuring out how to create a screencast introduction for the paper. I researched various free screencast options and, after much testing and inquiry, settled on the Screencast-O-Matic interface. While I experimented with this tool, I began producing a much-abbreviated version of our design plan, an extremely terse summation to use as narration for the screencast introduction. I created about a dozen narratives, testing each many times on Screencast-O-Matic, each time striving to produce a polished introduction in under five minutes. This iterative process required that I problem-solve on the go. For instance, because many of my test screencasts showed me looking off-camera at my prompts, I devised a large-print script to embed as a Word document on screen, so that my narration would appear more natural and unforced. I also had to prepare my online tutorials ahead of time, in a specific order, so that my narration could flow efficiently and succinctly. Finally, I collaborated with my partner on drafting a proper cover letter for our proposal, where I again refined our ID plan into a half-page, “in-a-nutshell” summation.
This evidentiary item illustrates my editing and writing skills, but the exhibition of those skills also demonstrates my ability to take learning theory and principles and apply them to instructional design. It takes mental agility not only to produce the content for an instructional design plan, but also to perceive what material is essential and what is scaffolding. My writing and editing here show that I can not only distinguish the former from the latter, but also separate the two on the page. The writing here illustrates that I understand the various instructional design elements and can not only describe how they interact and interrelate, but also weigh and judge their relative importance based on aims and desired outcomes. This plan provides further evidence that I feel comfortable interacting with new instructional and communicative technologies, and that I intuitively understand that centering instruction around the learner means that instructors and designers, to be properly pedagogically oriented, should also think of themselves as learners.
Conclusion
Being capable of contributing positively to a collaborative instructional design planning process is transferable to many professional environments. Some of the specific contributions I have demonstrated (creating a needs assessment, determining instructional goals based on that assessment, providing detailed descriptions of instructional units, and constructing clear, explicit expected learning outcomes) can be applied in these contexts as well. I have shown that I can plan the presentation of instruction in a nuanced way, describing how learners will participate and how they will be assessed and evaluated, and that I am qualified to assess the efficacy and relevance of instructional tools. These planning skills can be brought to bear in innumerable instructional situations.
On a more conceptual level, I have shown that I understand how prudent application of ID models can positively affect an instructional design plan, and, practically, I have articulated how assessment of instruction can and must be tied to expected learning outcomes. This ability to perceive how theory and research necessarily relate to instructional practice (and vice versa) can be applied anywhere that instructional design takes place. Furthermore, being able to distinguish essential, interrelated ID elements from necessary but temporary scaffolding, while simultaneously weighing and judging their relative importance based on aims and outcomes, is of value anywhere in the ID process. Finally, being comfortable interacting with new instructional and communicative technology tools, and having an intuitive orientation as a learner, should serve me well in an instructional environment.
References
Becker, K. (2015). Learning theory vs. instructional theory vs. instructional design model [Blog post]. Retrieved from http://minkhollow.ca/beckerblog/2015/07/07/learning-theory-vs-instructional-theory-vs-instructional-design-model/
Benjes-Small, C., & Miller, R. K. (2017). The new instruction librarian: A workbook for trainers and learners. Chicago, IL: American Library Association.
Faryadi, Q. (2007). Instructional design models: What a revolution! [Dissertation]. Retrieved from https://files.eric.ed.gov/fulltext/ED495711.pdf
Framework for Information Literacy for Higher Education. (2020, March 9). American Library Association. Retrieved from http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/infolit/Framework_ILHE.pdf
Gorman, M. (2012). The prince’s dream: A future for academic libraries. SCONUL Focus, 54, 11-6. Retrieved from https://www.researchgate.net/profile/Michael_Gorman5/publication/271605679_The_Prince's_Dream_A_Future_For_Academic_Libraries/links/5a9c01f6a6fdcc3cbacd3d91/The-Princes-Dream-A-Future-For-Academic-Libraries.pdf
Holliday, W. (2016). Instruction. In L. C. Smith & M. A. Wong (Eds.), Reference and information services: An introduction. Santa Barbara, CA: Libraries Unlimited.
Instructional Design Models. (2019). Retrieved from https://www.instructionaldesigncentral.com/instructionaldesignmodels
Kaplowitz, J. R. (2014). Designing information literacy instruction: The teaching tripod approach. Lanham, MD: Rowman & Littlefield.
Kovacs, D. K. (n.d.). Learning perspectives in a nutshell. Retrieved from https://www.kovacs.com/info250/lectures/learningperspectivespresentation.pdf
Lynch, J. (2016). You could look it up: The reference shelf from ancient Babylon to Wikipedia. New York, NY: Bloomsbury.
Pulichino, J. (2019). How learning theory can guide instructional design [Video]. Retrieved from https://www.lynda.com/Education-Elearning-tutorials/How-learning-theory-can-guide-instructional-design/782136/3508083-4.html
Wiggins, G., & McTighe, J. (2005). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.
According to Kaplowitz, there is a hierarchical relationship between learning theory, instructional theory, and ID models, where theories about how people learn inform pedagogy, and then instructional (pedagogical) theory influences how instructional designers develop models (2014, p 19). This hierarchal relationship is mirrored in Benjes-Small & Miller where the definition of learning theory (“a conceptual framework describing how learning happens”) is considered foundational, whereas instructional theory (“how instructors should teach based on what we know about what people learn”) is positioned in relation to the learner (2014, p 65). The former is learner-centered, while the latter is ostensibly teacher-centered (Becker, 2015).
Pre-eminent learning theories include cognitivism, which focuses on the internal processes that take place during learning, constructivism, which emphasizes emotional and social aspects of the unique individual, and social constructivism, which sees learning as a “process of developing psychological tools” through social interaction (Benjes-Small & Miller, 2017, pp 68-71; Holliday, 2016, pp 115-116). Learning theories differ in how they:
- Understand how learning occurs
- Assess the factors that facilitate learning
- Determine how learning changes behavior and affects performance
- Describe the learner’s role
- Develop assessments and/or measure success (Pulichino, 2019).
It seems more difficult to identify prominent instructional theories. The one I could definitively identify as an instructional theory, Sweller’s cognitive load theory (which comes from a cognitivist perspective), considers learner’s “working memory and its limits” within design instruction (Benjes-Small & Miller, 2017, pp 69-70).
Instructional design (ID) is a process to “design, develop, and assess learning experiences” (Benjes-Small & Miller, 2017, p 62), but it also aims to make instruction more effective by applying research from fields like education and psychology (Kaplowitz, 2014, pp 14-15). An instructional design model is a “proscriptive set of guidelines that help instructors structure the process of their teaching and student learning” (Benjes-Small & Miller, 2017, p 65). Many ID models owe a debt to an unattributed series of interdependent ID steps called ADDIE (Analysis, Design, Development, Implementation, Evaluation), that evolved originally among US military personnel in the 1970’s, only to be picked up by instructional designers in the 1980’s (Kaplowitz, 2014, p 20). Prominent ID models from the latter half of the 20th century include:
- Gagne’s Nine Events of Instruction (1965)— a theory that developed into an early ID model
- Dick & Carey’s model (1978)— one of the first systematic approaches to ID
- Wiggins & McTighe’s Understanding by Design (1998)— which utilizes a concept called “backwards design” and influenced the ACRL Framework for Information Literacy in Higher Education
- Heinrich & Molenda’s ASSURE model (1999)— similar to the Dick & Carey model with an emphasis on technology and media
- Booth’s USER model (2010)— designed by an ILI professional for ILI professionals, and
- Kaplowitz’s Teaching Tripod Approach (2014)— which ties ID design directly to IL instruction (Benjes-Small & Miller, 2017, p 67; Kaplowitz, 2014, pp 17-24; Faryadi, 2007, p 3)
Other contemporary ID models include Merrill’s First Principles of Instruction, the SAM model, Action Mapping, the Learning Circle Framework, the Morris, Ross, & Kemp model, and Kirpatrick’s Four Levels of Training Evaluation (IDM, 2019). Aside from incorporating learner-centered teaching (LCT) (Kaplowitz, 2014, pp 6-7), ID models target specific learners and generally involve:
- Analyzing content/skills to be taught
- Planning instruction
- Preparing materials, tools, delivery mechanisms, etcetera, and
- Designing instruction (Kovacs, n.d.)
Instructional design usually begins with analyzing learner needs (and identifying gaps in their knowledge and abilities) and understanding the learners themselves (Holliday, 2016, pp 112-117; Kaplowitz, 2014, p 32) followed by establishing expected learning outcomes (ELOs) (Kaplowitz, 2014, p 32 & 62). Learning outcomes ask what a “learner should be able to do after instruction,” are student-centered, and are used for assessing students, while learning objectives are similar to instructional goals in that they have primarily the teacher’s interests in mind; learning objects refer to the “package of content, practice, and assessment built around a single learning objective” (Benjes-Small & Miller, 2017, p 65). ELOs ideally, address audience, behavior, condition, and degree (the ABCD formula), use active verbs, and address affective outcomes (Kaplowitz, 2014, pp 65-72). Learning activities are designed and developed based on the learning outcomes (Holliday, 2016, p 120; Kaplowitz, 2014, p 83), and begin with the setting of an instructional goal or goals (Kaplowitz, 2014, p 60).
Before learning activities are completed, instructional designers will determine how students and instruction will be assessed both formatively (during instruction; low stakes) or summatively (after instruction; high stakes); assessment is not simply measurement, but an “essential part of the learning process” (Holliday, 2016, p 118). A core value in instructional design is the idea of alignment: that instruction “should align with the learning outcomes and assessments that provide evidence of learning and the context of learning” (Holliday, 2016, p 120). This ID process, that flows between learning outcomes, learning activities, and assessments, is summed up well by Kaplowitz’s Teaching Tripod (2014, p 32).
Information organizations have been moving in the direction of instructional design for almost three decades: information literacy (IL) has become the “dominant education paradigm,” in libraries, and information literacy instruction (ILI) has eclipsed other library instructional models (Holliday, 2016, p 102). Emerging out of the 1990s and having as its goal the teaching of “specific skills related to finding, evaluating, and using information,” ILI was developed by educational and informational organizations, in response to advances in technology and an economy restructuring around knowledge workers (Holliday, 2016, p 103). These technological advances have favored a more “interactive and participatory learner-centered approach” to teaching (Kaplowitz, 2014, p 19), which some have seen as a direct challenge to traditional academic learning (Gorman, 2012, p 117).
In 2016, after almost two decades of modification, the ACRL developed and released its Framework for Information Literacy for Higher Education, an ILI model which drew upon the pedagogical work of Wiggins and McTighe as well as threshold concepts, knowledge practices, the idea of dispositions, and the concept of meta-literacy (Framework, 2016, p 2). Kaplowitz, noting that both ID (and ILI following in its wake) had become fully enmeshed within the modern technological environment, makes a persuasive argument that ID processes provide librarians with a powerful tool with which they can make “thoughtful and appropriate decisions” regarding whether or not to wield particular technologies (2014, p 6), and thus making us less reactive to its seeming power. Holliday discusses how to make “wise choices about instructional technology” at length (2016, p 124) and concludes her article on ID and ILI by suggesting that library instruction will be strengthened by “thinking programmatically,” (2016, pp 125-129) meaning that instructional design within a library environment can do more, better, if everyone is on board.
For me, literacy and learning are entwined into one of the immutable pillars I see as upholding the values of the information profession. The ancient “encyclopedic dream” (Lynch, 2016, p 382) to amass all of the world’s knowledge is now a daily reality for librarians, who almost literally have the world at their fingertips. Information and knowledge are not the same thing however, and information overload has become a modern malaise. Bibliographic instruction has been partially swept away by information literacy instruction within libraries, and instructional design provides a powerful rudder with which librarian-instructors might begin to ground themselves within solid, new theories and practices.
Preparation to Understand Competency K: Coursework and Work Experience
Information Communities (INFO 200) and Reference and Information Services (INFO 210) both introduced me to the history of bibliographic instruction within reference services. INFO 200 also showed me the value of face-to-face (F2F) engagement and collaborative partnerships. Information Professions (INFO 204) prepared me for understanding how librarians construct broad strategic goals, and how this can affect how librarians go about designing, developing, and evaluating their programs.
For my internship (INFO 294) in XX library’s Eastern Region, I received professional experience working on my own teen program, and assisted librarians providing instructional assistance during their programs. I had the opportunity to discuss how and to what degree librarians perform needs assessments, develop learning outcomes, plan activities and curriculum, and/or assess and evaluate their work. I witnessed these librarians both individually, and in collaborative environments. I collected over a dozen different examples of how librarians organized their programs; the majority did not fully use instructional design systematically, but they all were aware of the instructional design elements they did use. I was able to witness examples of collaborative activity among learners, and in partnerships between individuals and organizations. Finally, I met one librarian who helped me understand what STEM, STEAM, and STREAM curriculums looked like, and gave me a printout that showed how to choose quality, developmentally appropriate STEM resources. I was able to assist her on three occasions with some STEM-related instruction.
The first three evidentiary items I chose each a reflect a learning activity I performed, each involving several steps, that led to a final instructional design plan draft. The fourth evidentiary item I present is the Final Instructional Design Plan itself. I chose to include all four stages in the creation of the ID Plan because they show the effort involved in constructing such a document, and because presenting them in stages shows the iterative process that was involved.
Evidence
Learning Activity 3: Analyze
Discussion of Evidentiary Items
https://documentcloud.adobe.com/link/review?uri=urn:aaid:scds:US:a3428a1b-4951-4a82-9cda-165873ab96f5
For my instructional design plan, I chose to reply to Tabitha Jay, Research & Collections librarian at XY College, who had requested assistance teaching her students how to choose and refine research topics. I agreed to collaborate with another student, and after reviewing instruction examples in Learning Activity 2, set to work. I and my partner interviewed Ms. Jay and colleagues for almost an hour via Zoom, asking numerous questions about the students (their needs, abilities, issues), the university, the librarians, the teachers, and what they might be looking for in an instructional design plan.
For Learning Activity 3, I (and my partner) worked through four steps:
1a. Through an examination of learner characteristics, and extant materials, and by determining who should be providing instruction, I and my partner developed a Needs Assessment Statement.
1b. By asking and answering questions regarding instructional goals, including purpose, desired outcomes, how consensus might be achieved, and the measurability of the goals, and referring to our Needs Assessment Statement, I and my partner crafted a terse Instructional Goals Statement.
2a. After analyzing the XY web page and Research Guide I decided that an interactive Libguide within their web pages would provide a location for our instruction, the form of which we decided would be online tutorials.
2b. I and my partner described our instructional unit as a general concept, and then broke that concept down into specific concepts and specific activities.
3. For Step 3, I and my partner reflected on the learner’s characteristics and entry behaviors, based on our interview with the XY librarians.
4a. In Step 4, I and my partner, taking all of the previous steps into consideration, wrote four (4) Learning Outcomes, describing what the students should be able to have learned after instruction.
4b. In the second part of Step 4 we were to look to our Learning Outcomes and describe how they motivate learners. Here I helped keep my partner (who had the overall vision for how instruction should look and function) on track, editing our description into pithy statements.
This project shows that I understand how to begin an instructional design plan in a collaborative way. I am capable of gathering together a needs assessment based on the unit of instruction and know what information would be relevant to produce such an assessment. Furthermore, I show that I can aggregate that data and use it to illustrate what learner’s characteristics and entry behavior look like. I demonstrate that I can determine instructional goals, based on a needs assessment, and then use that information to construct a cogent instructional goals statement.
This paper provides evidence that I have the ability to describe what an instructional unit will look like conceptually and in context, and then break that description down into specific concepts and activities. I substantiate that I have the thinking, writing, and editing skills necessary to use the foregoing data as the building blocks for constructing clear, explicit learning outcomes, with assessment and evaluation factored in. Finally, I show that I understand how to succinctly describe how these ELOs might motivate learners.
Learning Activity 4: Design
Discussion of Evidentiary Items
https://documentcloud.adobe.com/link/review?uri=urn:aaid:scds:US:2f7661ed-00a0-433f-af85-2fcc906970f1
In Learning Activity 4, I planned instructional strategies with my partner, based on our instructional analysis from LA 3. By this stage, my partner had a fully realized vision for what our tutorials would look like, so this part of the process had me cross-examining my partner’s ideas, and then attempting to corral those ideas into a readable document. This activity required us to describe a) how instruction would be practically presented and delivered, b) how learners would be participating and in what order, c) how learners would be tested and evaluated, and d) how we would measure that learning took place. As a writer I understand the importance of weaving backstory into a plot, and in writing and editing descriptions of our instruction, I found it helpful to fall back on the entry behaviors and the characteristics of our learners. For instance, because student needs ranged, I insisted that we stagger the tutorials from the easiest to the more challenging. The second activity, the topic development worksheet, was at first not described in much detail—my contribution here was to keep pushing my partner to supply more description until the outlines of the instructional unit came into focus. It was also at this point that I convinced my partner to drop the fourth instructional unit (and learning outcome), an online video tutorial; after a great deal of research testing out a dozen or more tutorials, I had concluded that there were none available that fit with our ELO and could also provide proper assessments.
In the learner participation section, I made sure that we articulated a design that not only took into consideration the learner’s skills and abilities, but also their limitations. While my partner was very good at making sure we chronicled every way in which learners would be participating, I ensured that wherever possible, we mentioned the ways in which learners were being motivated to participate and why. I also made sure we included any instances where theory informed our design. For instance, I mention that instructional units are completed in “discrete chunks,” which is a nod to cognitive load theory.
During the construction of the testing and assessment section, I came across the idea of assessment levels in Kaplowitz, and was able to incorporate the concept of behavioral assessments as a baseline way of measuring instruction efficacy (2014, pp 117-123); this proved to be a useful way for my partner and I to come to agreements on specifics. In the final section, where we were to determine whether student learning transferred into future work scenarios, I was able to refer to Kaplowitz’s assessment levels again and show that verification of long-term learning results was not our central concern. My defining the scope of our assessing and evaluating not only helped give a frame to our instructional design, but it allowed my partner and I to communicate better, which in turn allowed us think through the instructional scenarios with greater definition.
This evidentiary item proves that I have the ability to collaborate with others in producing an instructional design plan. I show myself capable here of planning how instruction will be presented, how learners will participate in the instruction, and how they will be assessed and evaluated. Within the writing and editing here there are passages that demonstrate my ability to present a plan in a nuanced way, for instance not simply describing that our instruction was designed for student’s capabilities, but to articulate that we also planned our instruction with student limitations in mind. Similarly, my linking learner participation to motivation, or linking instructional design to theory, indicate that I am capable of connecting theory and research to practical instruction, and am able to articulate that connection. I show that I can provide research and analysis of instructional tools and assess whether they can help fulfill ELOs or provide proper assessments. Finally, I demonstrate that I understand how the particulars of an established instructional design model, like the Teaching Tripod approach, can inform my own design. For instance, my application of Kaplowitz’s assessment levels to our ID plan provided us with a useful structure which helped us craft a better document.
Learning Activity 5: Develop (Instructional Design)
Discussion of Evidentiary Items
https://documentcloud.adobe.com/link/review?uri=urn:aaid:scds:US:83180b08-ed64-41c5-a830-682ef4bcf983
If my analysis in LA 3 produced a Needs Assessment statement, an Instructional Goal statement, and Learning Outcomes, and that work provided data and structure for designing instruction in LA 4, my work in LA 5 was about further developing that instructional design through reviewing and then choosing instructional materials, communication tools, and technical interfaces, and also clarifying how formative and summative evaluations would be delivered.
We began LA 5 with a review of a particular instructional tool, and our choice was to review the OERC (Open Educational Resource Common’s) because it allowed us to describe its “Guide-on-the-Side” technology, which enables students to work through a tutorial while simultaneously, in a split-screen view, interact with a live database. My participation here involved pressing my partner to honestly evaluate her tool of choice, and not avoid addressing some of its limitations. For instance, from testing OERC I found that it did not actually provide an email to student’s with test scores, as we had originally hoped, and thus some of our statements about how we would assess performance had to be called into question. At this stage, my research analysis forced my partner to do more research of her own, which resulted in the discovery of Springshare’s Side-by-Side technology, which could eventually be configured to produce the assessments that we hoped to provide.
Similarly, in the review of classroom communication tools, one of my important contributions was to cross-examine my partner about how much communication was necessary with the various tutorials, both for instruction and for assessment. For instance, we were claiming that the online tutorial, the worksheet, and the interactive “guide-on-the-side tutorials were self-contained units where instruction and assessment required no teacher or librarian intervention. But when pressed for how these instructional units would work in practice, it became apparent that some teacher or librarian intervention might be needed in certain situations. For instance, it became apparent that to get enough assessment data, text-based feedback cards and online polling would be necessary, all of which required librarian or teacher communication tools. In our review of our Learning Management System options I suggested that we might be able to embed the Libguide directly into the LMS, perhaps within the “Help” button, and even link to it with metadata tags, all of which my partner researched and confirmed as possible.
In Step 6, where we finally chose teaching and learning tools and instructional materials, I pressed my partner to make a decision regarding which OER tutorial we would use. I helped us reach a compromise where we would suggest using the free OER tutorial initially but include instructions for how the XY's IT department might, in the future, be able to develop a Springshare version that would provide proper assessments. I was thus able to find a way to express how our instructional tool might be used both practically and ideally within our design plan. In Step 7, where we finally detailed how our instructional tools would be assessed and evaluated in detail, I performed the task of bringing all our disparate assessment writings together in one place, and then challenged my partner to adequately address all the remaining holes in being able to link our assessments to our ELOs. My partner was able to mentally juggle linking the assessments to the ELOs, but I was the one who kept our results honest and accurate in words on the page. I assured that important details, like the mention of assessment tools serving as a cognitive pause, stayed with the final draft. My insistence on our assessments providing fine granularity served us well in our final paper, when these four pages would be reduced down to a very information-dense two pages.
This evidentiary item shows that I collaborate well with others. LA 5 includes passages that describe research and reviewing skills I possess, such as being able to run through online tutorials and analyze their suitability as instructional tools. I illustrate an understanding of communication tools, for instance how an interactive online tutorial might communicate results to a learner. I am able to articulate how an instructional tool might be used both practically and ideally within the same instructional design plan. I demonstrate an ability to describe in detail how instruction will be assessed and evaluated and to ensure that assessment be strictly tied to expected learning outcomes.
Final Instructional Design Plan with Screencast-O-Matic Introduction
Discussion of Evidentiary Items
https://documentcloud.adobe.com/link/review?uri=urn:aaid:scds:US:d6a64edd-a8a9-40eb-8541-2db81deb0d4c
Just before writing our proposal, I came to an agreement with my partner that we would be using Kaplowitz’s Teaching Tripod ID model as our definitive framework for presenting our final instructional design plan. I agreed to this model because of its streamlined and flexible modern design, a feature that was to prove helpful when we began synthesizing 34 pages of analysis, design, and development into a rough draft. The aforementioned work was definitely a collaborative affair, but I did the bulk of the work in transforming this rough draft into a five-minute screencast introduction, as well as a six-page paper capable of being orally presented in a 15-minute timeframe. I accomplished this by removing the scaffolding from our rough draft. I had to make educated guesses regarding what an appropriate length would be for a presentation to our client, what essential information needed to be conveyed, and what information could be cut without diluting our work. Once I settled upon an appropriate length, I had to determine the maximum word count for the plan, and then proportionally how many words could be allotted to each section of the plan, weighing each section in terms of importance.
I sifted through our document innumerable times, line editing the sections with the aim of reducing verbiage, increasing clarity, and keeping only what was essential. For some sections, like the Needs Assessment, this meant me tightening up the language and using bullet points. For analyzing entry behavior and learner characteristics it meant I had to boil 1.5 pages of analysis down into four sentences. The ELOs were the one area where I needed to collaborate with my partner, which involved asking for, and receiving final additions and deletions to these statements. I edited our analysis of learner motivation into a lean quarter-page section, but the parts describing instructional strategies and assessment needed enough room to express how these elements would work in practice. For the former section I mostly pared down the language to be as succinct as possible. However, my work on the assessment section was complicated by two factors: a) last minute changes regarding exactly how assessment and evaluative data would be gathered and measured, especially with the pen-and-paper worksheet, and b) needing to describe how assessments would work with the (more ideal) Springshare’s Side by Side technology, rather than the (more practical) OER Guide-on-the-Side interface. By having a writing plan, and by understanding the relative importance of the instructional design elements, I was able to allow for enough time for these important ID components to be described without overwhelming the stakeholders.
I brought these thinking, designing, writing, and editing skills to bear in the final section, where I summarized how and why our entire instructional design plan was created, described its contents and our design decisions, and provided requirements as well as suggestions for how the plan should be implemented. After the document was finished, I spent the final week of classwork figuring out how to create a screencast introduction for the paper. I researched various free screencast options and, after much testing and inquiry, settled on the Screencast-O-Matic interface. While experimenting with this tool, I began producing a much-abbreviated version of our design plan, an extremely terse summation to use as narration for the screencast introduction. I created about a dozen narratives, testing each one many times in Screencast-O-Matic, each time striving to produce a polished introduction in under five minutes. This iterative process required that I problem-solve on the go. For instance, because many of my test screencasts showed me looking off camera at my prompts, I devised a large-print script to embed as a Word document on screen, so that my narration would appear more natural and unforced. I also had to prepare my online tutorials ahead of time, in a specific order, so that my narration could flow efficiently and succinctly. Finally, I collaborated with my partner on drafting a proper cover letter for our proposal, in which I again refined our ID plan into a half-page, “in-a-nutshell” summation.
This evidentiary item illustrates my editing and writing skills, but the exhibition of those skills also demonstrates my ability to take learning theory and principles and apply them to instructional design. It takes mental agility not only to produce the content for an instructional design plan but also to perceive which material is essential and which is scaffolding. My writing and editing here show that I can not only distinguish the former from the latter but also separate the two on the page. The writing illustrates that I understand the various instructional design elements and can not only describe how they interact and interrelate but also weigh and judge their relative importance based on aims and desired outcomes. This plan provides further evidence that I am comfortable interacting with new instructional and communicative technologies, and that I intuitively understand that centering instruction on the learner means that instructors and designers, to be properly pedagogically oriented, should also think of themselves as learners.
Conclusion
The capacity to contribute positively to a collaborative instructional design planning process is transferable to many professional environments. Some of the specific contributions I have demonstrated, such as creating a needs assessment, determining instructional goals based on that assessment, providing detailed descriptions of instructional units, and constructing clear, explicit expected learning outcomes, can be applied in these contexts as well. I have shown that I can plan to present instruction in a nuanced way, describing how learners will participate and how they will be assessed and evaluated, and that I am qualified to assess the efficacy and relevance of instructional tools. These planning skills can be brought to bear in innumerable instructional situations.
On a more conceptual level, I have shown that I understand how the prudent application of ID models can positively shape an instructional design plan, and, practically, I have articulated how the assessment of instruction can and must be tied to expected learning outcomes. This ability to perceive how theory and research necessarily relate to instructional practice (and vice versa) can be applied anywhere instructional design takes place. Furthermore, being able to distinguish essential, interrelated ID elements from necessary but temporary scaffolding, while simultaneously weighing and judging their relative importance based on aims and outcomes, is valuable anywhere in the ID process. Finally, being comfortable with new instructional and communicative technology tools, and having an intuitive orientation as a learner, should serve me well in an instructional environment.
References
Becker, K. (2015). Learning theory vs. instructional theory vs. instructional design model. [Blog
post]. Retrieved from http://minkhollow.ca/beckerblog/2015/07/07/learning-theory-vs-instructional-theory-vs-instructional-design-model/
Benjes-Small, C., & Miller, R. K. (2017). The new instruction librarian: A workbook for trainers and learners. Chicago, IL: American Library Association.
Faryadi, Q. (2007). Instructional design models: What a revolution! [Dissertation]. Retrieved
from https://files.eric.ed.gov/fulltext/ED495711.pdf
Framework for Information Literacy for Higher Education. (2020, March 9). American Library
Association. Retrieved from http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/infolit/Framework_ILHE.pdf
Gorman, M. (2012). The prince’s dream. SCONUL Focus, 54, pp. 11-6. Retrieved from
https://www.researchgate.net/profile/Michael_Gorman5/publication/271605679_The_Prince's_Dream_A_Future_For_Academic_Libraries/links/5a9c01f6a6fdcc3cbacd3d91/The-Princes-Dream-A-Future-For-Academic-Libraries.pdf
Holliday, W. (2016). Instruction. In Smith, L. C., & Wong, M. A. (Eds.), Reference and
information services: An introduction. Santa Barbara, CA: Libraries Unlimited.
Instructional Design Models. (2019). Retrieved from
https://www.instructionaldesigncentral.com/instructionaldesignmodels
Kaplowitz, J. R. (2014). Designing information literacy instruction: The teaching tripod
approach. Lanham, MD: Rowman & Littlefield.
Kovacs, D.K. (n.d.). Learning perspectives in a nutshell. Retrieved from
https://www.kovacs.com/info250/lectures/learningperspectivespresentation.pdf
Lynch, J. (2016). You could look it up: The reference shelf from ancient Babylon to Wikipedia.
New York, NY: Bloomsbury.
Pulichino, J. (2019). How learning theory can guide instructional design. [Video]. Retrieved from
https://www.lynda.com/Education-Elearning-tutorials/How-learning-theory-can-guide-instructional-design/782136/3508083-4.html
Wiggins, G., & McTighe, J. (2005). Understanding by design. Alexandria, VA: Association for
Supervision and Curriculum Development.