Tuesday, May 27, 2008

Sending you a Word document

Dear Teresa,
If you send me your email address, I will send you a Word document of the critique. In this format, the charts don't show up, or at least, I don't know how to transfer them.
Jeannie

Critique EDER 679.20 - Blended Learning

Dear Teresa,
I have had computer problems most of this evening. As a result, I have not finished editing the critique. It's too long and not in essay format. So go for it and review away!!! I look forward to 'hearing' what you have to say.
Jeannie





CRITIQUE

‘ASSESSING SOCIAL PRESENCE IN ASYNCHRONOUS TEXT-BASED
COMPUTER CONFERENCING’





Jeannie Locatelli

May 27, 2008

The following is a synthesis, critique, and reflection on Rourke, Anderson, Archer, and Garrison’s paper entitled ‘Assessing social presence in asynchronous, text-based computer conferences’ (1999). In their paper, the authors support the use of computer-mediated conferencing (CMC) in the asynchronous distance-learning format used in many higher education courses. They see CMC as a way to develop the higher-level thinking that higher education engenders, and they argue that social presence is necessary to support teaching and cognitive presence. Figure 1 below illustrates this in the ‘Community of Inquiry’ model presented by Garrison, Anderson, and Archer (2000). Rourke et al.’s 1999 paper describes a coding template and the research the authors conducted to ascertain whether it is reliable in assessing the level of social presence in individual courses using the CMC format. Because CMC is being used by many universities in a growing number of courses, the authors feel, “it is important (a) to develop research methods that explore the nature of teaching and learning in these environments; (b) to apply these tools in authentic contexts; and (c) to use the results to develop instructional models that use this technology effectively” (Rourke, Anderson, Archer, & Garrison, 1999).

Figure 1: The ‘Community of Inquiry’ model (Garrison, Anderson, & Archer, 2000) [chart not reproduced in this format]

Rourke et al. (1999) use content analysis to conduct quantitative research on social presence in CMC. As stated in the paper, categories needed to be developed in order for the researchers to decipher the data in a meaningful way; these are: 1. interactive responses, 2. affective responses, and 3. cohesive responses (1999). The following table (Figure 2) shows these three categories along with the 12 indicators used by the researchers. Interactive responses have six indicators, affective responses have three, and cohesive responses have three (Rourke et al., 1999).

Figure 2 (Table 1): Model and Template for Assessment of Social Presence (Rourke et al., 1999) [table not reproduced in this format]

Methodology:

The pilot test to measure social presence was conducted on two graduate-level distance courses, each 13 weeks long and supported by CMC. The chart (Figure 3) below summarizes the two course selections used in this test.
Figure 3: Overview of the two course selections

Selection 1 (Transcript A)
Course: graduate-level, workplace learning; FirstClass conferencing system
Duration: 13 weeks
Participants: 14
Instructor: passive; only active at closing; used positive reinforcement in the summary
Moderators: two students in similar roles
Messages: 90
Word count: 24,132

Selection 2 (Transcript B)
Course: graduate-level, theory and practice of distance learning; WebCT conferencing system
Duration: 13 weeks
Participants: 17
Instructor: active throughout the conference, both as participant and instructor
Moderators: two students in similar roles
Messages: 44
Word count: 6,260

Results:
The ‘social presence density’ was established by comparing the transcripts of the two courses. Although there were “2.5 times as many instances of social presence in Transcript A [Selection 1] than in Transcript B [Selection 2]” (Rourke et al., 1999), there were also many more messages and words in A than in B; therefore, the aggregate social presence density, which is based on the number of words in each transcript, was greater for Transcript B than for Transcript A. “Thus to compare two selections [transcripts], we sum the raw number of instances and then divide by the total number of words. This allows for a more meaningful comparison of transcripts and should also facilitate comparisons across studies” (Rourke et al., 1999). Because the resulting values were so small, the coders multiplied the totals for each transcript [selection] by 1,000.
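
To make the density calculation concrete, here is a minimal sketch of the arithmetic in Python. The function name is my own, and the instance counts are hypothetical placeholders chosen only to match the reported 2.5:1 ratio between the two transcripts; the word counts come from Figure 3 above. This illustrates the formula as I read it, not code from the paper.

def social_presence_density(instances, word_count):
    # Social presence incidents per 1,000 words of transcript text.
    return 1000.0 * instances / word_count

# Word counts come from Figure 3; the instance counts are hypothetical,
# chosen only to reflect the reported 2.5:1 ratio between A and B.
density_a = social_presence_density(250, 24132)  # roughly 10.4 per 1,000 words
density_b = social_presence_density(100, 6260)   # roughly 16.0 per 1,000 words
print(density_a, density_b)

Worked through this way, the smaller but denser Selection 2 comes out ahead, which matches the comparison the authors report.
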
Discussion:
The purpose of this study was to test the research tool. At first glance at the chart above, it would appear from the sheer number of messages and words that Selection 1 would have a greater social presence density than Selection 2. However, through the correct use of research analysis, the data proved otherwise. This encourages the researchers and suggests that their template is able to ‘see’ further into the reality of the situation. Further analysis of the template shows that the ‘Interactive Responses’ may be better suited to a continuum scale. The researchers are also analyzing whether each indicator needs to carry the same weight.
It is recognized that social presence is necessary to deepen academic learning, and it would be important in social presence density analysis to be able to indicate when too much social presence becomes as detrimental as too little. In other words, the authors would like to use their research tool to identify an optimum balance of social presence density.
Conclusion:
The researchers feel that the indicators in this study are likely to be altered, as some are not strong enough in the design of the study to give clear information. Triangulation is also a clear possibility, as further investigation needs to be done on student “participation, attitude, and learning in the community of inquiry” (Rourke et al., 1999). The hope is that a mechanized analysis can be achieved in the future, possibly with the addition of more indicators. Overall, it is felt that there is a strong basis for optimism that social presence density can be measured alongside cognitive and teaching presence. In addition, it is important to link the data found in social presence density with desired learning outcomes.
Critique:
The article embeds social presence within the ‘Community of Inquiry’ model by Garrison, Anderson, and Archer (2000). It is clear that social presence in face-to-face learning in higher education is significant in achieving a high standard of learning outcomes. The article’s purpose is to determine whether a particular research design can be used to measure social presence density within computer-mediated conferencing (CMC) courses. From a quantitative research perspective, the validity of the study is good, in that the method of comparing the indicators is able to uncover the social presence density of two courses with dramatically different numbers of messages and words. Despite the obvious temptation to assume that the course with the higher number of messages probably had the higher level of social interaction, an organized method showed that the opposite was true. The researchers concluded that not all of the indicators carried equal weight, and that future use of this research tool would require a refinement of the indicators and possibly an adjustment of the value system to better reflect each type of indicator. It was also concluded that triangulation might be necessary to conduct grounded studies of social presence density. The researchers felt that student participation, attitude, and adaptation to a community of inquiry may be better served by qualitative research methods.
I agree with the conclusion of the study that the method of research had both strong and weak components. The researchers are trying very hard to formulate a quantitative research tool to make the task less cumbersome; however, my feeling is that the nature of social presence is more conducive to a qualitative research format. The researchers’ suggestion in their conclusion that triangulation may be required indicates to me that they are thinking along similar lines.
The researchers’ suggestions for the further use of their model and template for the assessment of social presence also lead to confusion for the reader: they support possible triangulation, yet they still want more and more indicators that will help to mechanize the research process. This conclusion, along with the organizational problems discussed below, points to the need for further clarity in the assessment tool and in the article in general.
Reflection:
I have been a face-to-face teacher/instructor for a very long time, and it is clear that social presence is a wonderful foundation for cognitive and teaching presence. Moving to CMC courses places a great deal of pressure on the instructor to create what many of us have enjoyed about learning: the interaction through discussions, projects, and debate that can very easily be created in face-to-face courses. Although the assessment tool needs further refinement, I feel it is very important to have a method to measure social presence density in CMC courses. I found it interesting that there is a correlation between the amount of time it takes to create social presence in a CMC course and the amount of time it takes to measure social presence in the same course. We all had the impression technology would make life easier!
One of the observations noted in the study was that, of the two selections/transcripts, the one where the instructor was most active showed the higher level of social presence density. It would be interesting to analyze that transcript to see whether it was the instructor or the students who added most of the social presence; this would be a worthwhile measurement to study.
When designing my own blended learning course, I would hope to have the face-to-face component of the course first. Having been in eight CMC courses since 2004, I have seen how successful introductions have been in the first week of the courses. The instructors who took time to develop introductions, helped new online learners navigate the maze of new applications, and oriented all the students to the new course set the best tone, one that usually lasted throughout. My preference would be to do the same in a face-to-face setting. Because of the humanistic side of teaching, I also feel the computer conferencing would be better received once the instructor and participants had a picture of one another.
A thought arose for me while reading various articles concerning social presence and the widespread doubt that social presence can be achieved through a text-based medium. The thought is this: for centuries, people used letter writing as a means of communication. Pride was taken in how information and emotion were conveyed. There was no instant exchange between the writer and the recipient. It seems to me we are beginning to recapture some of that in a text-based course. Time is required to digest and properly respond to messages, and I think this is redeveloping the lost art of thoughtful written communication.
Literature Review:
The article by Swan and Shih (2005) supports Rourke et al. (1999) on the need for triangulation to further study social presence in CMC. The study conducted by Swan and Shih (2005) incorporated both quantitative and qualitative methods. An important finding in this study pointed to the importance of instructor presence, instructional design, and the students’ input into the online experience (131). The indicators used in this study are somewhat different; however, they fall under the same three categories of affective, cohesive, and interactive responses.
Garrison and Vaughan (2008) point out that there has been an evolution in attitudes toward the existence of social presence in online learning. When online learning was introduced, the greatest concern was how social presence could be developed if the students were not face-to-face. They indicate that social presence alone is not enough, and that it must support cognitive and teaching presence as well (20-21). Further, the design of blended learning requires the intentional incorporation of all three components throughout the course (47).
Hughes, Ventura, and Dando (2007) were interested to see whether Rourke et al.’s research on social presence had any validity in the UK higher education system. In fact, they found that it could be used in the UK with some modifications, and they cited the affective category as needing some attention. Their revised social presence coding template used the same categories: affective, interactive, and cohesive. They reduced the number of indicators to 11 and added two further columns: one titled ‘criteria’ and the other ‘keywords’ (25). These researchers feel that social presence density can continue to be studied, with refinements to meet the needs of diverse groups.
In their ‘Guidelines for practice’ chapter, Garrison and Anderson (2003) emphasize that the role of the instructor in online learning is a versatile one. It takes a practiced instructor to know when to step forward and when to facilitate. The statement ‘All these roles require teaching presence with an education goal in mind’ (81) indicates the complexity of social presence and how strongly it is embedded in intentional instructional design.
Is the article well written and organized?
This article flows in a logical way, and from that viewpoint it is easy for the reader to follow the rationale and the research methods. However, there are a number of inconsistencies that would throw a novice reader of research articles completely off their train of thought.
Number one: Under ‘Content Analysis of Social Presence’, the authors discuss quantitative content analysis as an effective way of studying a subject like social presence. Then, in the ‘Methodology’ section, qualitative analysis using AtlasTi is clearly discussed. Going back to the ‘Unit of Analysis’ section, it appears that the method of coding is based on quantitative research methods. Overall, it is very confusing for the reader to decipher which research method is actually being used.
Number two: Again under ‘Content Analysis of Social Presence’, in the fifth paragraph the authors relabel Garrison et al.’s categories as interactive responses, affective responses, and cohesive responses, in that order. However, when expanding on these responses, the order changes to affective, interactive, and cohesive. The reader is left to wonder whether there is a shift in the importance of these responses, as no explanation is given.
Number three: The authors speak of Selections 1 and 2 in ‘Methodology’, yet refer to them in ‘Results’ and ‘Discussion’ as Transcripts A and B. There is no explanation for the change of terms referring to the two courses.
Number four: The use of the term ‘indicators’ in ‘Table 1: Model and Template for Assessment of Social Presence’ is clear; however, the text description of the three responses does not clearly highlight the term ‘indicators’ except under ‘Cohesive Responses’. This further confuses the reader and leads to unnecessary re-reading of the text to place the correct information in a logical way.
Number five: Table 1 and Table 2 are exactly the same, and again the reader is left confused about what constitutes coded text. It looks as though the example column of the table provides this. It would have been a good idea to include an example of how the coders actually used the table; perhaps that is what Table 2 was meant to show.

References:

Garrison, D. R., & Anderson, T. (2003). E-Learning in the 21st Century: A Framework
for Research and Practice. New York: RoutledgeFalmer.

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based
environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105.

Garrison, D. R., & Vaughan, N. (2008). Blended Learning in Higher Education. San
Francisco: Jossey-Bass.

Hughes, M., Ventura, S., & Dando, M. (2007). Assessing social presence in online
discussion groups: A replication study. Innovations in Education and Teaching
International, 44(1), 17-29.

Rourke, L., Anderson, T., Archer, W., & Garrison, D. R. (1999). Assessing social
presence in asynchronous, text-based computer conferences. Journal of Distance
Education, 14(2), 51-70. Retrieved July 8, 2007, from http://cade.icaap.org/vol14.2/rourke_et_al.html

Swan, K., & Shih, L. F. (2005). On the nature and development of social presence in
online course discussions. Journal of Asynchronous Learning Networks, 9(3).

Sunday, May 25, 2008

Blog started!

Here I am, Norm. Does it work?
Jeannie