Exchange of Ideas
1. I had casual conversations with people from schools other than my own.
Targeting EXPLORATION
- To what extent are participants getting to know others, having casual conversations, learning the "other story"?
- Conversations outside the context of focused sessions.
- Social conversations where people learn about each other and their situations.
2. During sessions, I had conversations on topics related to teaching and learning.
Targeting INVESTMENT
- To what extent are participants engaged in conversations about teaching and learning, taking in and processing professional content?
- Conversations within sessions that are content focused.
- Professional conversations where people learn about and explore specific ideas related to teaching and learning.
3. In casual conversations, I shared with other people the ideas I encountered today.
Targeting IDEA SPREAD
- To what extent are participants engaged in talking about newly encountered ideas?
- Conversations outside the context of focused sessions.
- Social conversations where people share what they have learned or experienced at the event.
Professional Networking
4. I exchanged contact info with people for the purposes of future professional interactions.
or maybe
I swapped contact info with people for work-related interactions in the future.
Targeting BUILDING LEARNING COMMUNITY
Building connections (adding new people to the network so that there are resources available when a learning need arises).
Rajagopal, K., Brinke, D. J., Van Bruggen, J., & Sloep, P. B. (2012). Understanding personal learning networks: Their structure, content and the networking skills needed to optimally use them. First Monday, 17(1), 1–12. http://doi.org/10.5210/fm.v17i1.3559
5. I re-connected with people today that I don't often get to see.
Targeting MAINTAINING LEARNING COMMUNITY
Maintaining connections (keeping in touch with relevant persons) (Nardi, et al., 2000; Nardi, et al., 2002).
6. I sought out a specific person today for the purposes of professional interaction.
Targeting ACTIVATING LEARNING COMMUNITY
Activating connections with selected persons for the purpose of learning (Nardi, et al., 2000; Nardi, et al., 2002).
Professional Learning
7. I learned things today that will improve my professional practice.
Targeting IMPROVED PRACTICE
Intensive professional development, especially when it includes applications of knowledge to teachers’ planning and instruction, has a greater chance of influencing teaching practices and, in turn, leading to gains in student learning.
Darling-Hammond, L., Wei, R. C., Andree, A., Richardson, N., & Orphanos, S. (2009). Professional learning in the learning profession: A status report on teacher development in the United States and abroad.
8. I learned things today that will improve student learning.
Targeting STUDENT LEARNING
"Research suggests that professional development is most effective when it addresses the concrete, everyday challenges involved in teaching and learning specific academic subject matter, rather than focusing on abstract educational principles or teaching methods taken out of context. For example, researchers have found that teachers are more likely to try classroom practices that have been modeled for them in professional development settings. Likewise, teachers themselves judge professional development to be most valuable when it provides opportunities to do “hands-on” work that builds their knowledge of academic content and how to teach it to their students, and when it takes into account the local context (including the specifics of local school resources, curriculum guidelines, accountability systems, and so on)"
Darling-Hammond, L. (2013). Getting teacher evaluation right: What really matters for effectiveness and improvement. Teachers College Press.
9. I learned things today that will contribute to my school’s priorities and goals.
Targeting SCHOOL IMPROVEMENT
"Research suggests that professional development tends to be more effective when it is an integral part of a larger school reform effort, rather than when activities are isolated, having little to do with other initiatives or changes underway at the school."
Darling-Hammond, L. (2013). Getting teacher evaluation right: What really matters for effectiveness and improvement. Teachers College Press.
After testing the survey with some colleagues, I revised the Networking set because the existing questions lent themselves to simple yes/no responses. Rather than change the response anchors, I revised the questions to fit the "Like Me?" anchor set. The new set is included here:
- I had casual conversations with people from schools other than my own.
- During sessions, I had conversations on topics related to teaching and learning.
- In casual conversations, I shared with other people the ideas I encountered today.
- In the future, I will connect with people I met today to explore ideas in teaching and learning.
- I re-connected with people today that I don’t often get to see.
- I sought out specific people today to explore ideas in teaching and learning.
- I learned things today that I will use to improve my professional practice.
- I learned things today that I will use to improve student learning.
- I learned things today that I will use to meet my school’s priorities and goals.
There was also a question about the 7-point scale, with a recommendation to use 5 points to reduce distraction and increase familiarity.
The anchors came verbatim from a document offering standard Likert response sets (Vagias, 2006). For the set titled "Like Me?" seven anchors were given, and no other response sets were offered for that type of prompt. Sauro (2010), in a web article, also recommends 7-point scales for greater discrimination and reduced response error.
While Likert scales return useful ordinal data, little confidence can be placed in the intervals between anchors (Fowler, 1993). One should instead tally the responses to each anchor and resist the temptation to average, or interpolate between, anchors.
Leung (2011) suggests a work-around with an 11-point scale which, he argues, "increases sensitivity and is closer to interval level of scaling and normality." Another study found that having more options resulted in greater reliability, though the gains level off at about five options (Lissitz & Green, 1975). While there may be greater reliability with 7 points, the gains are minimal.
From the respondents' point of view, I considered that the 7-point scale offers a kind of neutral-kind-of-disagree and neutral-kind-of-agree at points 3 and 4, while 2 and 6 are more definitive statements suggesting a degree of commitment; 1 and 7 have an extreme, enthusiastic edge to them.
Given that one cannot meaningfully interpolate between anchors, it may be useful to have this range of options. I have already printed the 7-point version for tomorrow's nine-question Edcamp survey and will evaluate those results and responses, revising if necessary for our April survey. Fortunately, Dawes (2007) found that the mean scores in a test of both 5- and 7-point scales on the same survey were the same, so there would still be some utility to the original survey results should subsequent measures use a 5-point scale.
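The tally-don't-average approach can be sketched in a few lines of Python. The responses below are invented for illustration, and the anchor wording is only an approximation of a "Like Me?"-style set:

```python
from collections import Counter
from statistics import median

# Hypothetical 7-point responses for one survey item
# (roughly 1 = "not at all like me" ... 7 = "very much like me").
responses = [7, 6, 6, 5, 7, 4, 6, 5, 2, 6, 7, 5]

# Tally responses per anchor rather than averaging the ordinal codes.
tally = Counter(responses)
for anchor in range(1, 8):
    print(f"anchor {anchor}: {tally.get(anchor, 0)} responses")

# If a single summary number is wanted, the median respects the
# ordinal nature of the scale better than the mean does.
print("median:", median(responses))
```

Reporting the per-anchor counts keeps the shape of the distribution visible, which an average would hide.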
Reflecting on the data collection tools used
Graffiti Wall
This feedback tool received little attention from participants. We positioned the paper near the exit of the main meeting space, which was a high-traffic area. In retrospect, it needed to be in a visible space but not a high-traffic one. We suspect that individuals didn’t want to spend time in a space where many people were passing by, potentially blocking the flow of traffic. Secondly, we suspect that having some guiding questions on the page might have elicited more responses. The page simply had the Edcamp name on it and, while the purpose was explained verbally in the opening, participants may have forgotten about it, or were hindered somehow by the large blank sheet of paper (not wanting to be the first, not knowing what to say, unsure of expectations, etc.).
Scatter Plot
This feedback tool had a positive response. Seventy-four of the attendees placed dots on the graph with all but 7 rating above the half-way line on idea exchange. They were almost evenly split with 31 on the low half of engagement with people and 43 on the high half.
Survey
We placed a survey at each seat in the main meeting space so that when participants returned at the end of the day they could begin responding. The response rate was 75.8%, with 125 of the 165 attendees submitting the evaluation survey. In addition to the nine Likert response questions in the three domains, there was a space at the bottom of the page inviting comments. Forty-one respondents included written comments which were, on the whole, extremely positive. A word cloud of the survey results identifies the most frequently used words, with teachers, great, day, ideas, enjoyed, and school dominating the image. The survey results were compiled in SPSS, first with all responses blended, then again separating teaching staff from education assistants (EAs) to get a better sense of whether the format served both groups. The report is embedded below and includes the narrative feedback as well as the data charts. There is more interpretation yet to be done, and the questions themselves need some revision, but the results appear to support the efficacy of Edcamps in engaging participants in idea exchange, networking, and professional development.
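Word clouds of this kind are driven by simple token-frequency counts. A minimal sketch of the idea follows; the comment strings and the tiny stopword list are invented stand-ins, not actual survey responses:

```python
from collections import Counter
import re

# Invented stand-ins for the written comments; the real input would be
# the 41 free-text responses transcribed from the survey sheets.
comments = [
    "Great day with great ideas from other teachers",
    "Enjoyed the day, so many ideas to bring back to school",
]

# A tiny ad-hoc stopword list; a real analysis would use a fuller one.
stopwords = {"the", "a", "and", "to", "so", "with", "from", "other", "back"}

# Lowercase, extract alphabetic tokens, drop stopwords, then count.
words = re.findall(r"[a-z]+", " ".join(comments).lower())
counts = Counter(w for w in words if w not in stopwords)
print(counts.most_common(5))
```

The resulting frequencies are what a word-cloud generator scales the font sizes by.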
Download the report HERE
Digital Artifacts
Fewer than half of the sessions had notes taken; challenges with individuals logging in to the Google Docs likely account for some of the low participation in that area. The notes that were taken suggest focused engagement among participants on the session topics. Tweets tended to reflect affective engagement, expressing appreciation or excitement for the event rather than sharing content and learning. These digital artifacts were compiled into a Storify page embedded at the end of this page.
Observations
- EA and teacher ratings were very close on Q1 Exploration, Q5 Maintaining Networks, and Q9 School Improvement, and further apart on Q2 Investment, Q4 Building Community, and Q8 Student Learning. In general, EA selections reflected distributions similar to teachers’ but averaged 0.55 points lower across all responses. Anecdotal responses from EAs, in person and on the response sheets, were all positive: “better than I expected it would be.”
- Participants engaged in conversation and participated actively in sessions. The idea exchange ratings are high and positive, suggesting that participants both shared and took in many new ideas. On the way out at the end of the day, participants were asked to place a dot on a graph (Figure 1) indicating their sense of engagement with people and ideas. Almost all dots were above the half-way line for ideas, indicating that participants felt they had been exposed to many new perspectives and strategies.
- Of the three broad categories, the networking responses ranked lowest. As an Edcamp goal, this area is worth some attention – helping participants see the value in connecting with others and expanding their professional networks as well as the benefits of open sharing and engagement with people in the network. While the networking results were the lowest of the three categories, they were, on average, in the “Like Me” range.
- Some 1 ratings are explained by a couple of groups who reported using the day to collaborate with building colleagues on professional tasks related to their practice and student learning. Two surveys included such explanations for the low ratings in some areas. Such engagement would likely depress responses to questions 1, 3, 4, 5, and 6.
- One comment expressed dissatisfaction about how the respondent’s invitation was received and valued. Despite the negative comment, the Likert responses for that same respondent were all very positive.
- We learned in conversation that no one responded to a particular session invitation. The participant who made the invitation concluded that no one had the skills or the interest to engage with that topic, and then moved to another session of interest.
- The PE sessions appeared to be mostly play, though the conversations during play related to strategies to develop student skills, activities and drills used, and demonstrations of technique. Several times throughout the day, those engaging in the Gym sessions would conference at the bench and share ideas.
- One of the TAA participants shared that several TAA teachers met with a new staff member to help with strategy and program planning, as well as instruction on some equipment used in the TAA lab. The participant reflected, “I don’t know when we could have ever done that. It’s rare that we are together, especially in our labs.”
- The music teachers also created several spaces and themes to meet their own needs. Interestingly, many of these specialty teachers also broke from their own groups to attend sessions or engage in conversation elsewhere. This appears to be a strength of the Edcamp model – offering a diverse range of content with opportunities for content specialization as well as idea exploration.
- Extreme caution must be exercised with this data due to the combination of the survey’s brevity and wide scope. With only one question targeting each of the nine items, the margin of error is large, and the validity of each question is itself untested. However, the data gathered does suggest participants found value in the experience. Based on anecdotal comments included on the survey, there was also widespread satisfaction with the Edcamp format. Over 80% of the responses for all categories and questions fell on the “like me” side of the scale, with more than 65% indicating “True” and “Very True” responses in all categories.
Conclusion
Given the strong positive responses to each of the nine survey questions in the three domains, it appears that participants feel Edcamps foster idea exchange, promote professional networking, and offer opportunities for professional development. Further analysis of the data, and another test of the survey with a different group, will afford opportunities for testing the survey's validity and reliability.
Traditional PD has an expert talking at you while you listen, trying to make connections. There is little or no dialogue, and the relationship between presenter and attendee is imbalanced: one is the giver and the rest are receivers.
Edcamp equalizes those relationships. Many of the survey comments for Edcamp MY7Oaks talked about the conversations, dialogue, and idea exchange. As a participant you give AND receive; you are a contributor to the bank of knowledge and an agent in growing skills in others, as well as being able to address your own goals. It is rewarding and engaging on both fronts.
One verbal comment suggested that we are often exhausted by traditional PD and find it stale and uninteresting by the afternoon. Edcamp, she observed, was the opposite: she felt invigorated and excited for the afternoon. Could this be because we are not simply recipients, and the act of giving, contributing, and feeling valued for our experience is itself rewarding?