Sunday, June 26, 2011

Chapters 5 & 6

These two chapters cover what I see as the most critical factors for the success of an online class. The first is building the web site. I think this is important because people who do not know web design will sometimes include extra HTML scripts that slow down the loading of the page. The other is updating and maintaining the e-course and site. At school we are on a 3-year review process, but a course might need to be updated before that for reasons like book changes, new facts and figures, or realizing there may be too much or too little content.
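
As an aside, one common way to keep an extra script from delaying the page is to load it only after the page has finished rendering. This is just a minimal sketch of my own (not something from the chapters), and the file name extra-widget.js is a made-up placeholder:

```typescript
// Minimal sketch: load a non-essential script only after the page has rendered,
// so it does not hold up the initial display of the course page.
// "extra-widget.js" is a made-up placeholder file name.
window.addEventListener("load", () => {
  const script = document.createElement("script");
  script.src = "extra-widget.js";
  script.async = true; // let the browser fetch it without blocking anything else
  document.body.appendChild(script);
});
```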

Saturday, June 11, 2011

The Three e's

Well, initially I was thinking, how is this article any different from what we have already read? But what I did find very nice was Merrill explaining how problem-based and problem-centered are different. I think I would have considered problem-based to be part of problem-centered, and not something that actually makes the students solve a problem.

Beyond that, Merrill gives us a really good description of problem-centered instruction, effective peer interaction, and the technology-enhanced interface. Problem-centered instruction we have already covered. Effective peer interaction really just outlines that there needs to be a mutual push and pull between partners, and that the collaboration is what makes it so valuable. The technology-enhanced interface section really shows how to incorporate media into presentations.

My critique of this article would be that it does not seem to be e-learning driven. How can students pull the information to themselves? I recently read another article arguing that e-learning should not be linear, but should instead let the learner focus on what they don't really understand and decide how the information builds on what came before. I will have to find that article for everyone. But that is my one critique.

Tuesday, May 31, 2011

Preparing and Testing a Prototype

So Chapter 3 (Preparing for Testing a Prototype) and Chapter 4 (Testing a Prototype) essentially set up the reader for what a prototype involves and how to use it. Each chapter is divided into 5 parts, which are as follows.

Chapter 3 - Preparing for Testing a Prototype
  • Pretest - by doing a pretest we will know what topics a student has already mastered, and by incorporating posttests, what they were not able to master.
  • Design on paper - this makes it feel more like a draft of the course. It also lets the later stages be more hands-on.
  • Develop the prototype - this should capture, at a high level, what your site will look like. It should not be completely finished, because that would take up a lot of time. For the same reason, the information in the prototype should be very basic.
  • Apply instructional principles - enough said :)
  • Test the prototype yourself
Chapter 4 - Testing the Prototype (because I am sure we all overlook things in what we design)
  • Authentic subjects, authentic tasks, and authentic conditions - all of these are needed to get the best results from the prototype.
  • Hold a pilot session - so you can see firsthand what the learner is doing and where they are having problems.
  • Observer guidelines - well, I can give a personal story about this. When it comes to testing a prototype, we have a lot of ownership as IDs, so we should try not to debate what we were trying to do, and we should not sway learners in any direction. We just need to observe and ask questions pertaining to navigation.
  • Conducting the sessions - here the visual explains it all! It tells us what to do during the actual session, from greeting participants to thanking them.
  • Analyze, analyze, analyze - there are actually two parts to the analyze section. I think I would change how they describe it, but that is because of the task analysis class I took. Essentially you identify trends and then define the problems.
For my critique, as I said, I would look at it from the perspective of my task analysis class. First we map out what most people do. Then we highlight the different bottlenecks, and then we analyze those bottlenecks. I think showing it in a process map is very visual and makes it easy to pinpoint problems.

Another critique I have goes beyond the actual text, because I find the text overall to be pretty good. More than anything, I thought the layout was very hard to follow, but that might just be because I am reading it on my computer. That is another thing for us to keep in mind as IDs: some people may print content and others may not, and that can make things challenging for learners.

Sunday, May 29, 2011

Week 3.2 Effective Web Instruction

So, knowing Dr. Frick and Prof. Boling, it is really interesting to see how the two of them collaborated on this document. I mean, I can tell which parts Boling put in because of her amazing sense of design.

Chapter 1 just kind of lets us get our feet wet with what they plan to talk about. It introduces the idea of inquiry-based design, or creating web instruction by using data.

Chapter 2 really starts laying out the process, although the process map is really confusing. It is almost like a circle, but some arrows go the wrong way on the right side, so it was kind of difficult to follow (look at page 4). The table shows what a team needs to have in order to be successful at inquiry-based design: knowledge of HTML, analysis, graphic design - it is all important.

I would have to say that after Chapters 1 & 2, inquiry-based design is still not defined well enough for me. Chapter 2 has a section titled "So what is this inquiry based instructional design," and it really is not defined there. I know from Chapter 1 and from the term inquiry itself that it is data driven, but how is it good, and why is it the "effective" way? We just learned that problem-based instruction was very good, so how is this better or worse?

Week 3.1 Mager, R. F. (1984). Preparing instructional objectives (2nd ed.). Belmont, CA: David S. Lake.

Mager (1984) does a nice job of informing readers what an objective does. It is simple enough that people outside the field should be able to understand it, which is nice for when IDs go to work with SMEs.

Mager states that a learning objective is what a learner should be able to do, once finished with the training, in order to be competent. I think sometimes we forget about competence and make what we measure in an objective more than just competence, so it is important to keep that in mind.

Additionally, an objective describes what the learner should be able to do when they are done. Objectives are not the topics the instructor teaches, but usually the two go hand in hand. That is the good thing about objectives: they set standards, but each instructor has their own way of teaching that they are better at. So as long as the instructor's methods meet the end outcome, they are being compliant.

Mager also does a good job of describing the parts of an objective: the audience, the behavior (overt or covert), the condition, and the degree. I always have problems figuring out the difference between the condition and the degree, because to me they go together. I like to think of it as "the learner will do this, under these conditions, at this level." For example, in "given a blank order form, complete it with no more than one error," the blank form is the condition and the one-error limit is the degree. That usually snaps me back into separating the condition from the degree.

I would also like to note that as IDs we need to take out the ID jargon. When I was first designing a course, I tried to use the fanciest words from Bloom's Taxonomy, but no one besides the few people with master's degrees knew what I was saying.

Anyway, my one complaint is with objectives themselves rather than the article, because I liked the article a lot: if someone can meet the course objectives, does that mean they passed the class? If that is ultimately what we are testing, and they can do all of those things, then they should pass the class. I think this is where education falls short. Students are being graded on participation and attendance, but does that really help measure the objectives?

Sunday, May 22, 2011

Week 2.2 CHANGES IN STUDENT MOTIVATION DURING ONLINE LEARNING

This study looks at why learners lose motivation during self-directed e-learning (SDEL). It found factors (relevance, technology, and age) that decrease motivation throughout SDEL. The authors were able to relate motivation to student satisfaction, and from there link the problems to principles for sustaining motivation in SDEL. The principles include:
  • Make sure content is relevant
  • Keep it at the appropriate difficulty level for the learner
  • Incorporate multimedia
  • Simulate real-world situations
  • Provide hands-on activities
  • Give appropriate feedback
  • Make navigation easy
  • Incorporate social interaction
I can agree with these for the most part; I do think they are important. But I like the 5 Star System (Merrill, 2001) better. Being able to rank the principles gives a more visual way to see weaknesses.

The first criticism I have is that this study only covers SDEL. Perhaps if they evaluated a distance education program, they might find that motivation stays higher than in SDEL. I can give an example of this. For work orientation, I had to learn different LMS tools. The tool is already user friendly, and I have used two different LMSs in the past. I also learn better by just being thrown into software and only looking things up when I need help. But I am in this distance course, and I can say that my motivation is higher. Plus, there is a level of interaction with other students that makes you want to learn more.

The article stresses that there are three overarching factors that hinder self-directed e-learning (SDEL): internal, external, and personal. Here I would have to disagree. I do not think there is any need for the personal category; all factors are either internal or external, and personal problems fall into one of those.





Wednesday, May 18, 2011

Week 2: Merrill (2001) 5 Star Instructional Design Rating

Merrill designed a 5 star rating system for instructional design. In this system, a course earns stars by meeting different criteria. A course can receive up to five stars, each varying in color (bronze, silver, or gold). The star system is limited to specific types of courseware architectures, like tutorial or experiential, because receptive and exploratory courseware consist of an information push rather than letting learners make connections.
The five stars are as follows:
  • Problem-based
  • Activation
  • Demonstration
  • Application
  • Integration

I think that problem-based and activation will be easy to understand. For the others, the questions are: does the e-learning demonstrate the task clearly, will learners then be able to apply it to other situations, and finally, will the participant be able to integrate it into their daily routines?


The reading does not really tell you how to label the rating a course receives; it just tells the reader how many stars there are and that each could be a different color. So I am just writing it up in the easiest way for me to explain.


I was having a hard time finding suggested e-learning to grade, but during the process I stumbled upon http://mentor.ucs.indiana.edu/~frick/r547/2006/onlinetesting/


Problem-based - I give this a bronze star. It does not seem to identify a problem that I may have, but it does tell (not show) what task I will come away with.
Activation - I am interested in this because, as an ID creating e-courses, I have to take instructors' paper-based tests and put them online. It did not interest me from the beginning, and had I not had to rate it, I think I would have stopped after the first couple of tasks. I only really got interested when I found out you can use it with an LMS. Silver star.
Demonstration - it does a good job demonstrating! Sometimes, though, I could have gotten by with just a simple picture, but that is because I am a little used to working my way around new software. Gold star.
Application - I was not sure what to give the application. The course tells the reader to work in the actual file, which is good, but it does not give immediate feedback. Essentially, the learner finds out whether they were successful from the expected-outcomes description the course gives. Bronze star.


Integration - no star. The course did not let the learner showcase what they learned or have any sort of open discussion. Perhaps if there were a forum at the end, it could help them integrate.
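
Just for my own bookkeeping - and this is a rough sketch of mine, not anything from Merrill's article - here is how I might tally the stars I gave above so the weak principles jump out:

```typescript
// Rough sketch (my own, not from Merrill): tally the stars I gave the
// online-testing course above so the weak principles stand out at a glance.
type Star = "none" | "bronze" | "silver" | "gold";

const onlineTestingCourse: Record<string, Star> = {
  problemBased: "bronze",
  activation: "silver",
  demonstration: "gold",
  application: "bronze",
  integration: "none",
};

// Anything below a silver star is a principle I would work on first.
const weakSpots = Object.entries(onlineTestingCourse)
  .filter(([, star]) => star === "none" || star === "bronze")
  .map(([principle]) => principle);

console.log(weakSpots); // ["problemBased", "application", "integration"]
```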


The other e-learning course we looked at as a class is located here: http://payson.tulane.edu/courses/ltl/projects/entrepreneur/main.swf


This one I found very good! Here is the breakdown of my ratings:


Problem-based - it shows several examples that are problem based. For instance, the Mexican restaurant: will Russians eat Mexican food? Gold star.
Activation - by presenting the different types of industries, it gives students something to activate their prior knowledge with. Gold star.
Demonstration - it demonstrates within each section, and as you get into the sections on the left, the student gets a chance to try it. Gold star.
Application - the learner gets a chance to apply what they have learned in the previous industries to each of the remaining industries, and ultimately by creating their own business.
Integration - students were able to create their own business, but I am not sure whether they were able to showcase it. I don't know if it was my browser, but the create-your-own-business section had no field to write ideas into. Maybe it was supposed to be that way. It asks for evidence, so I think learners are able to defend their position by providing evidence. I give it a silver star.


I look forward to seeing how everyone else rated the courses they found!