Posts
Week seven of CEP811 is waning, and this has been a very busy week for me. I was in San Antonio Tuesday through Friday for a conference and then came back with a cold. Luckily, this week was manageable because of all the reading I needed to do. Below is a collection of articles I found through the MSU library that focus on inquiry learning, science education, and using digital tools to accomplish those tasks.
Article 1
Bell, R.L. (2005). Whole-class inquiry: science. Learning & Leading with Technology, 32(8), 45-47. Retrieved from the ProQuest Research Library.
This article considers three modes of instruction in a science classroom: textbook, hands-on using technology, and whole-class inquiry. Teaching from a textbook gives concise but narrow explanations of the concept to be learned. Hands-on work is more effective, but relies on careful planning and pre-instruction from the teacher. Whole-class inquiry increases engagement and allows students to build knowledge through shared experience.
A former teacher, Bell recognizes that the research on deploying new technologies in schools had not yet been done at the time. He notes that appropriate questions and data will inform best practice in the future. His experience as a science teacher informs his methods of instruction for pre-service teachers.
Article 2
Horvath, L.C. (2008). Tangled up in inquiry: Documenting pre-service teachers’ perspectives on inquiry as they reflect on the process of planning and teaching inquiry-based lessons. Ann Arbor, MI: UMI Dissertations Publishing.
This dissertation studied pre-service science teachers’ perceptions of inquiry learning both before and after teaching an inquiry-based lesson. The study followed 13 teachers and compiled 84 distinct characteristics of inquiry. Ten of the 13 teachers interviewed showed significant shifts in their perceptions of inquiry learning. Common characteristics found in the study included students gathering and analyzing data, problem solving, group work, and asking questions. The author noted that including inquiry instruction in pre-service training would be beneficial during student teaching.
Article 3
Tessier, J. (2010). An Inquiry-Based Biology Laboratory Improves Preservice Elementary Teachers’ Attitudes About Science. Journal of College Science Teaching, 39(6), 84-90. Retrieved from the ProQuest Research Library.
This study looked at pre-service elementary teachers during their college biology class. The author was interested in student satisfaction in their experience of inquiry-based learning and the likelihood for the students to use similar methods in their teaching. This was done in response to the loss of time in elementary science classrooms. A statistically significant portion of students new to inquiry learning said they would most likely use the method in their own classroom. The author suggests that inquiry-based learning should be a part of pre-service teacher training.
Article 4
Padilla, M. (2010). Inquiry, Process Skills, and Thinking in Science. Science and Children, 48(2), 8-9. Retrieved from the ProQuest Research Library.
This brief article noted the differences between “inquiry” and “process skills.” According to Padilla, the two are often confused by teachers. He says that inquiry should include indicators such as students asking questions, designing procedures, collecting evidence (data), forming explanations, and describing the results. Process skills can lead to inquiry, but are not synonymous with it. He suggests teachers improve their questioning, but also encourages silence from the instructor to give students space to think.
Article 5
Cartier, J.L., Stewart, J., & Zoellner, B. (2006). Modeling & Inquiry in a High School Genetics Class. The American Biology Teacher, 68(6), 334-340. Retrieved from the ProQuest Research Library.
This is a case study from a high school genetics class which used inquiry-based learning to help students learn concepts about genetics. The authors adapted their current curriculum to guide students through the process of uncovering genetic principles. The authors also stress the importance of developing a “scientific community,” in which everyone is a member and helps construct knowledge. They state that the inquiry method would not have been successful without building the community of learning first.
Reflection
I need to admit right up front that I’m a “just Google it” person by nature. That’s where most of my searches begin. (That being said, the Google Scholar resource is pretty awesome.) Libraries are such a great resource, and having been out of school for a while, it is nice to be able to access research articles that are typically behind paywalls or subscription services. The ProQuest database and ERIC were extremely helpful, as were the search tools. I started with keywords like “science education” and “inquiry learning” and then refined from there. At one point, I did have a question about obtaining print materials, so I hopped in the 24/7 live chat and got an answer right away. Unfortunately, I didn’t get the request in early enough to include the article in this post, but I’m excited to read it.
This post is a revision of the original experiment I posted two weeks ago. The main purpose of this is to add more elements of Universal Design for Learning and to elaborate more on the process used to help students build their own understanding of speed based on experimentation.
Additions
This activity will have a larger scope than the immediate physics relationship. Students will work with their biology (and health?) teachers to study human physiological reactions to activity. Heart rate, muscle fatigue, breathing patterns, and the like can all be studied. Students will be asked to take factors like exercise patterns, sleep habits, and nutrition and evaluate their effect on physical tasks. The bicycle can then be used after a period of experimentation to take new data and draw conclusions.
To address the process of encoding and decoding graphs, I’ll be adding an activity from David Wees, a math teacher who often does experiments with web tools being used to teach through inquiry and games. Not long before I wrote the original experiment, David shared an interactive graphing game that I referenced, but didn’t pay much attention to. The player is asked to move a stickman in such a way that a real-time graph matches a pre-determined line. The graph is labeled and clearly shows the effect of any action in the game. Students can use this to form explanations of the components of graphs and how they relate to one another.
This leads into the bicycle hooked to the Raspberry Pi. The parameters are similar (distance over time) but we’re adding the physical act of pedaling as well as the physics component (speed) as outlined.
Reflection
I have to admit, this re-write is challenging. The components of UDL all seem to focus on choice, multiple means of acquisition and sharing, and multiple opportunities for learning. Rewriting an activity to include more components of UDL by adding parameters seems to be counterproductive.
That being said, my original plan did not do a whole lot to support the task of reading and creating graphs, and I think the addition of David’s stickman game will address that problem. I also think this was more an exercise in writing clearly than it was about incorporating principles of UDL. My original intent was to have simple prompts with multiple points for experimentation, assessment and revision, and I think that has been maintained (for the most part) in this update. Perhaps the wider picture is something I envision frequently, but communicate rarely.
My teaching has always focused on openness…BYOD, open Internet assessments, open-ended assignments…I think all of these things are supported by the UDL framework and are not things I articulate in new lessons. Science is a story…exploration and experimentation help us navigate that narrative. This entire activity is designed to have students do something they’re familiar with and apply it to a new idea.
Standards and goals for activities are good guides for learning, but too much of a focus on how to get students down that path robs them of authentic opportunities to experiment and defend their ideas. Rather than approaching UDL as a checklist for lesson design, we need to look beyond the components and find ways to promote the ideas they represent. Do we need a specific line in a plan that says, “Students will create an online resource for [fill in the blank]?” Or, should we allow them to come to us with the ideas for sharing and support them in that goal?
The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘Eureka!’ but ‘That’s funny…’
Isaac Asimov’s words hit home (thanks, David Grossman, for sharing!). I would argue that we replace “science” with “learning.” It doesn’t happen by having a section in a lesson plan for “providing options for sustaining effort and persistence,” and that mindset takes a serious mental shift for the teacher (and student) to achieve.
All this to say: we need to focus on providing the means to support multiple opportunities for students to learn in their own way. I don’t want to worry about what each student “prefers.” I’d rather be open enough so that each can go his or her own way and be successful.
I’ve been ensconced in Chrome lately. I worked with the team at TechSmith who brought Snagit to Chrome and now, I’m researching different ways to make learning more accessible through Chrome apps and extensions. I’m not going to get into the argument of why it is or isn’t a big deal. I’ll just say that yes, I think it is a big deal, but save the why for another post.
Last night, Google published an extension that brings Google Now-like voice searching to Chrome. It runs in the background and lets you use the “OK Google” trigger to search for anything.
You should grab the extension to try it yourself.
Like I said in the video, other than feeling like I’m in Star Trek, talking to my computer and having it talk back, what implications does this have for learning? If students can do a search for anything without even using their hands, it should really change the way we think about technology in classrooms.
What ideas do you have?
Design is the application of intent – the opposite of happenstance, and an antidote to accident.
The quote I open this post with is from Robert L. Peters, a designer, thinker, and professor originally from Canada, but teaching globally. This is especially apropos because of the task this week, designing a classroom, and because this is something I thought about constantly while teaching. Design influences frame of mind, expectation, and ultimately, behavior in any given space. Schools, in my opinion, haven’t paid enough attention to design, which is why we struggle to accomplish collaborative learning or inquiry-driven learning goals.
Greg Green is the principal of Clintondale High School in Detroit. Greg and I spent time together working with a team on the Four Pillars of Flipped Learning. We were tasked with explaining and classifying Flexible Environments. Greg and I talked about the types of spaces that must be present in any classroom to support all types of student learning needs. Because I am not currently in the classroom, I took some creative liberty and designed the ideal classroom space based on my discussions with Greg. The four areas we identified were:
- Individual space
- Group (collaborative) work spaces
- Small group instruction
- One on one instruction
Obviously, these could be accomplished in any variety of ways, and I approached it through an intentional floor plan, furniture, flow, and available resources.
Each area of the classroom is designed to meet a particular need. It is also easy to get up and move around as needs change throughout the course of a class. A student could begin working individually, but easily move to another area of the building to join his or her group, or get some extra help from the teacher.
A main argument for the need for varied learning spaces comes from Howard Gardner’s Multiple Intelligence Theory. According to Gardner, people have cognitive strengths they pull from to solve problems (Brualdi, 1996). We’re all familiar with classifying students as “visual” or “kinesthetic” learners. However, there is no evidence suggesting that focusing on particular facets of intelligence described by learning style theorists helps students succeed (Riener & Willingham, 2010). Design as an effort to meet these multiple intelligences is shortsighted and doesn’t address the needs of learning as a practice.
For such an ambitious undertaking, the entire community would need to be involved. School leaders, teachers, designers, parents, and most importantly, students, should have a say in the way the space is structured and implemented. More often than not, students know their comfort zones and how to address their own needs. Even without a purpose-built space, for example, a student can carve out individual learning space by putting in earbuds in a noisy room. It is the task of the teacher to help students evaluate what kind of environment is most conducive to learning. The space should support the behavior we want to see in a given situation.
Placing a cost estimate is nearly impossible because of all the variables involved. We can go for top of the line digital tools and put the cost into a prohibitive zone, or we can evaluate the needs of the space and work to solve those needs effectively. Aside from furniture, this learning space only has large televisions for presentations or discussion on digital media. Multiple whiteboards are included for on-the-fly collaboration, problem solving, or brainstorming. There are no computers in this space because learners are often using their own device(s). Comfort is important, and the task of learning an unfamiliar tool can often get in the way of focusing on the work being done.
Large projects are very difficult to implement all at once. With this particular project, I think the mindset of what schools should look like and do will be a major barrier. Schools in their current form have been around since the early 20th century. The system of compartmentalized education is such a part of our culture, that a shift in a direction that gives students freedom and choice in their learning path is a major uphill battle. If we can begin talking about schools the same way we talk about libraries and community centers, design change will follow close behind.
I’ll admit right at the beginning that this post is a shameless use of all facets of my network. This blog is one of those. So, if you’re someone who doesn’t like it when people do that, you can stop reading, I’m sorry. But, I do ask that you give me a shot.
Thanksgiving is upon us, and customarily, people are sharing out their quips of thanks for the season. Some go through each day and give one thing they’re thankful for.
I think this is something we need to do more often in education. It is very easy in today’s climate to get beaten down and complain about the things going wrong in our schools. I count myself in that group. There are a lot of posts on this blog in which I extol the adversity in my classroom and building. However, I would like to invite everyone to share something they’re thankful for in education. There are more wins out there than losses, and I want to make those as public as I can.
If you’re not familiar with what I do nowadays, I work with TechSmith Education. Part of my job is to host a weekly podcast on the EdReach Network titled Chalkstar to Rockstar: Revolutionary Ideas in Learning. I get to share out stories of teachers doing amazing things in their classrooms each week, and I’ve had the chance to interview some amazing people.
Next week is the Thanksgiving episode. It goes live on Wednesday, Thanksgiving Eve, and I want to share as many stories of thanks in education as I can. To do that, I need your help. Please take 30 seconds to fill out a two question survey, of which only one question is mandatory. I’ll be sharing all of the responses on the podcast as well as an accompanying blog post. If you could fill out the embedded form below and then pass it along, I’d be much obliged.
You can also share on Twitter using #eduthanks. If you want to pass the survey along, you can use http://bit.ly/eduthanks.
For me, I’m thankful for teachers who continue to fight the good fight against overwhelming odds. You all are an inspiration daily (and I’m not just saying that). Happy Thanksgiving, everyone.
CEP 811 is steaming forward at full speed and we’re now getting close to finishing week four of the course. This week, we’ve been tasked with creating an outline for a MOOC. After many days opening a new blog post and staring at it, I think I’ve finally landed on a format and topic. So, without further ado, I humbly submit for your consideration…
In I’ll Take It to Go, my peers will explore mobile creation skills by working only on mobile phones for the duration of the course, supported by open communication, feedback, and remixing among peers.
Course topic: facilitating active learning on mobile devices.
Students are coming into schools with mobile devices which are not being utilized for a variety of reasons, one of which is not knowing how to effectively engage students in higher order thinking skills. Often, mobile apps and tools are dismissed as only having entertainment value. We are missing a huge opportunity to leverage the computing power in their pockets.
So, the question is, “Why mobile devices?” Consider the amount of time you use your device each day. Directions, research, quick communication…all done on the go. We capture moments through photos and video, we share our lives with one another as we move from place to place. These simple (and often free) tools can be repurposed to support students and the learning process. Nearly all students have experience with mobile devices, so the time spent teaching complicated tools can be eliminated. Remember, Cognitive Load Theory states that learning can only occur when the student can apply sufficient working memory resources (Sweller, van Merriënboer, & Paas, 1998). Too often, new tools command students’ focus rather than the learning task at hand. By using familiar tools, accentuating process, and encouraging connections, the course will push learners into higher-order application of ideas and skills.
This is meant for all educators and students. Tools that can be used by students can (and should) also be used by teachers and other staff to engage, encourage, and support learning. This won’t be a typical MOOC. The course will be decentralized and focus on skill building and innovative application of mobile learning techniques. Learning targets will have suggested tasks to complete, but participants will be able to network, explore, and create their own products for completion. Peer evaluations will be used as benchmarks for progress through the course, and the course can be taken in any sequence. That being said, the length of the course may vary from one person to another.
Participants in the course will be expected to use their mobile device to create a history of artifacts to demonstrate their learning. Areas of focus will include photography, video, audio, social media, and blogging. While all tasks can be done on a traditional desktop or laptop computer, the main objective of the course is to immerse learners in the world of mobile tech so they can bring their experiences back to the classroom to more successfully engage their students. The time it takes to complete is partly determined by the depth of exploration that occurs within each topic and the resulting peer assessment, revision, and remixing. There is no prescribed “time on task,” and learners will have an opportunity to explore ideas as in depth as they would like.
Putting it together
The majority of MOOCs focus on using the Internet as content delivery…a large pipeline through which information can be delivered from one person to thousands. The problem is that the Internet doesn’t work like a pipe. It works like a network, with information criss-crossing from one person to another. If we want to design effective online classes, we need to build courses that mimic that network. As long as MOOCs focus on the technology (the LMS used for delivery) and the content (top-shelf professors), their design and effectiveness will continue to suffer. Pedagogy must have as much importance as the other two, if not more, in order to truly innovate in online education.
I watched a TEDxBeaconStreet talk the other evening entitled “Reimagining Learning.” It started off well enough, with some good points about the challenges of teaching in a digital age. I really liked Richard’s opening point:
There’s a more serious digital divide that we face in this country. That is the divide between those who know how to use technology to reimagine learning and those who simply use technology to digitize traditional learning practices.
Not too bad, considering I’ve even written about reimagining schools through Flipped Learning.
He then made some jokes and quips about scanning photos and using projectors as really fancy chalkboards. Ha ha.
He argued that the way to really change schools is to personalize learning. Again, something I can get on board with.
And then he dropped this bombshell:
< crashandburn >
My heart fell. There are so many things in this story that put Richard, in my mind, solidly in the camp of “digitizing traditional teaching practices.”
The students walk in every day and they see on these screens, their names…and they see where they’re supposed to go to learn that day.
I don’t know about you, but the first thing I want my students to see when they walk in is me, smiling, welcoming them back to the room to learn together. Step one in this case is digitize the teacher.
And then they go, like this group of girls right here, and they learn whatever they’re doing. At the end of the period, they stop a few minutes early, and they take a quick three-question test.
Their performance goes into an algorithm that customizes their schedule for the next day.
Rinse, wash, repeat. (And, I bet if a teacher were around in that picture, they could tell you what the girls were working on that day.)
He then goes on to talk about MOOCs (attributing the idea improperly) and how “reimagining learning” is really just opening it up to hundreds of thousands of people. No mention of the massive attrition rate of students nor the fact that MOOCs aren’t solving real problems in higher education.
I think I’ve come to the conclusion that most of the widely-publicized talks on education are either 1) given by people with lots of money, or 2) given by people who want to make lots of money. There have been very few compelling TED talks lately that have really communicated some of the major change that can come to education when we really think hard about what technology can help us do.
I’m not saying there aren’t any. Ramsey Musallam’s “Three Rules to Spark Learning” and Kristin Daniels’ talk on reinventing professional development are top notch. I’m convinced they are because they’re teachers. Not venture capitalists. Not entrepreneurs. Not CEOs or filmmakers.
Maybe I’m just watching the wrong talks, but I know that I’m waiting for TED to look past the hype and bring back some great ideas.
Another post in the series for CEP811, we’re really getting serious now as we begin to develop potential plans for our maker kits.
Last week, I wrote about a potential activity using an old exercise bike and a Raspberry Pi hacked together. (It even had a super-fancy animated GIF as a bonus.) In short, the idea was to have the students pedal an exercise bike, send some data to the Pi, and have it graph (in real time) the student’s speed as a function of time.
A lot of this project comes from my longing for a better experience with physics and math in high school. Both were drab, disconnected, and frustrating for me. Since joining Twitter in 2011 and following people like Frank Noschese, Dan Meyer, and Ramsey Musallam, I really wish I had an experience like what they give their students.
I want to focus on one theory in particular: Cognitive Load Theory (CLT). According to CLT, working memory constraints are the determinants of instructional effectiveness (Sweller, van Merriënboer, & Paas, 1998). The authors break cognitive load into three types of “load”: intrinsic, extraneous, and germane.
Intrinsic load is related to the nature of the content being taught. Extraneous load is related to the instructional methods and conditions, and germane is the formation of learning schema (Sweller et al., 1998). Tasks with low interactivity contain elements that do not interact with each other, can be learnt in isolation, and require relatively low working memory load (Ayers, 2006). A high working memory requirement comes from tasks that have multiple interacting elements that need to be learned simultaneously rather than in isolation (Ayers, 2006; Sweller, 1999; Sweller & Chandler, 1994). In addition, Marcus, Cooper, and Sweller (1996) state that understanding “is applied only when dealing with high element interactivity material.”
With this in mind, my activity is designed to reduce the cognitive load placed on students as they explore the concept of speed using an exercise bike and a Raspberry Pi.
In order to introduce speed, students need an understanding of how to graph. (I deliberately use the term “understand” here because of the relationships required to produce proper graphs.) Sweller et al. (1998) suggest that students who have not automated this content into existing schemas could experience a high cognitive load from the wrong material and be unsuccessful in the goal of the activity.
At the start of the activity, students will be asked to pedal an exercise bike for a period of time. They will not be given direction on how fast to pedal because the second part of the activity will ask them to analyze their graph. The Raspberry Pi will automate the graphing process so students can focus solely on the task of creating a working definition of “speed.” Students will also have an opportunity to repeat the experiment as often as needed in order to confirm their result.
This activity can also be used to introduce the idea of average speed in relation to instantaneous speed. The analysis of the graph will ask students to plot a best-fit line in order to report the average speed of their trial. Typically, this activity is done where students take all the data, create the graph, and then attempt to draw conclusions. I am automating data collection and graphing so students can focus on coming to the correct conclusion rather than filling their working memory with procedural components.
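Since the Pi automates the data collection, the best-fit step reduces to an ordinary least-squares slope over the logged (time, distance) points. A minimal sketch of that calculation, with made-up trial data for illustration:

```python
def average_speed(times, distances):
    """Average speed as the least-squares (best-fit) slope of distance vs. time."""
    n = len(times)
    mean_t = sum(times) / n
    mean_d = sum(distances) / n
    rise = sum((t - mean_t) * (d - mean_d) for t, d in zip(times, distances))
    run = sum((t - mean_t) ** 2 for t in times)
    return rise / run  # m/s when times are in seconds and distances in meters

# Hypothetical trial: distance logged every 2 seconds
times = [0, 2, 4, 6, 8]          # seconds
distances = [0, 7, 13, 22, 28]   # meters
print(round(average_speed(times, distances), 2), "m/s")
```

Students still do the interpretive work of deciding what the slope means; the code only removes the plotting-and-arithmetic overhead.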
Materials for this activity are difficult to produce because of the exploration that students need to do. By deliberately withholding information and direction, students are more likely to take risks and form hypotheses that can be tested further throughout the class. Science is all about exploration and hopefully, this activity will allow them to explore freely.
In the future I hope to incorporate more ideas around inquiry and perplexity, but that will have to be in another post. For now, consider this TED talk by Ramsey Musallam on the unique opportunity we have every day to perplex and engage students in critical thought and exploration.
I went to Goodwill this evening with Lindsey and Meredith. I had wanted to go for a while, and after not finding much in the clothing, I turned towards the assorted gadgets in the back to hunt for some cool toys for this assignment.
As luck would have it, there was a great old exercise bike there.
It even had a working pressure dial and speedometer on it.
I bought a Raspberry Pi for the course and I’ve already started working on a project coding in Python and using my telescope. There is a ton you can do with some cheap switches and circuit boards, so I thought it would be cool if my classroom (someday) had a way to introduce graphing using a manipulative. I have to admit I was in a frame of mind for graphing for a couple of reasons.
First, Dan Meyer had a blog post rounding up some great classroom action he saw in the blogs this week. One was referencing novel ways to introduce students to graphing. Dan posted a quote from the original, which I am reposting here:
…a comment laced with negativity that resonated with Lauren and me was an outburst that “graphing used to be so easy, and this just made it hard.”
The second reason I was thinking about graphing this evening was because of a link from Ramsey Musallam to an interactive graphing game by David Wees. I spent a good amount of time playing the game, learning, experimenting, and working to connect the physical act of moving the stick figure to the way the line was being drawn.
So, I came to this idea: students could ride the bike, which has a controller hooked to the Raspberry Pi, to create a graph of velocity over the time the bike is pedaled.
In order to get the tachometer on the bike to talk with the computer, you’d need some kind of controller.
Process
- Take the backing off the exercise bike tachometer to mount the electric switch.
- The switch will need to mount inside the casing somehow. You would want it to make contact each time the gear rotated once. This could be done by mounting a trigger arm on the gear to contact the switch to complete the circuit.
- Run the lead from the switch to the Raspberry Pi. I’m not sure if you would need some kind of intermediate step here before it feeds to the computer. I’m still researching.
- A simple Python script on the computer would count the number of times the switch is activated for a given period of time to calculate the RPM value.
- The value would be plotted on a graph against time for as long as the bike is running.
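The counting step in the last two items can be sketched without any hardware. On the Pi, the pulse count would come from the GPIO pin wired to the switch (I haven’t tested that wiring, so treat it as an assumption); the math itself is just scaling a short counting window up to a minute:

```python
def rpm_from_pulses(num_pulses, window_seconds):
    """Each switch closure marks one wheel rotation, so scale the count to a minute."""
    return num_pulses * 60.0 / window_seconds

# Hypothetical reading: 14 closures counted over a 5-second window
print(rpm_from_pulses(14, 5), "RPM")
```

A shorter window makes the readout more responsive but noisier; a longer one smooths the estimate.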
I’m still learning Python, but you could start with this snippet of code to get the momentary velocity:

```python
# Convert wheel RPM to linear speed at the rim: v = omega * r,
# where omega = RPM * 2 * pi / 60 (about 0.10472 rad/s per RPM)
r = float(input('Radius [meters]> '))
rpm = float(input('RPM> '))
rad_per_rpm = 0.10472
v = rpm * r * rad_per_rpm
print(v, "m/s")
```
By wrapping this function in a `while` loop, you could probably create a pretty nice graph for the time the student was riding the bike. You could then even take it into an experiment where they measure the change in velocity as more resistance is applied to the wheel.
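As a sketch of that loop, assuming some per-second source of RPM readings (faked below with a hard-coded list; on the bike they would come from the switch counter) and a hypothetical wheel radius:

```python
import math

WHEEL_RADIUS = 0.10  # meters; a made-up value, measure the real flywheel

def velocity(rpm, radius):
    """Linear speed at the rim: v = omega * r, with omega = rpm * 2*pi / 60."""
    return rpm * (2 * math.pi / 60) * radius

# Fake one-reading-per-second sensor data standing in for the while loop
sample_rpms = [0, 40, 80, 95, 100, 100]

# Build (time, velocity) pairs, one per second of riding
series = [(t, round(velocity(rpm, WHEEL_RADIUS), 2)) for t, rpm in enumerate(sample_rpms)]
print(series)
```

Plotting `series` gives the velocity-vs-time graph; repeating the run with more resistance on the wheel would let students compare the curves.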
Resources
Yenca, C. (2013, October 31). Giving graphingstories.com a go. mathycathy. Retrieved from
October was Connected Educator Month. (Soapbox moment: I can’t wait until we can get rid of these silly “awareness months.” Okay, I’m done.) I’ve been on Twitter for almost three years and I was curious about what engagement actually looked like for a given period of time. So, I decided to do a little experiment.
Gathering Data
On October 1, I turned on all of the notifications for Twitter except for DMs and replies. Those are easy enough to count with a tool given to me by LivingTree called Twitonomy. I was more interested in two things:
- How many RTs and favorites I would get
- What kinds of tweets were favorited and retweeted
Unless you have email notifications (which I hate) turned on, you don’t get to see your RTs and favorited tweets in third-party Twitter clients. (Sidebar number two: you should use a management app. Twitter on the web is horrible.)
The Process
Like a good scientist, I had a couple of controls. First, I didn’t tell anyone I was doing this. I didn’t want to sway the normal activity of my normal interactions. So, for those of you who unwillingly contributed, thank you. Second, I didn’t change my habits of tweeting. I tweeted jokes, nonsense, commentary, snark, resources, articles, pictures…pretty much business as usual. Again, I didn’t want to saturate my stream with some kind of bias for results.
I broke my tweeting habits up into four groups:
- Commentary
- Resources
- Blog Posts
- Other
Commentary – These were off-the-cuff comments, snark, or other general statements about what I was thinking at the time.
Everything Sal #Khan talks about is nothing the KA system does. #facepalm
—Brian E. Bennett (@bennettscience) October 1, 2013
Resources – This is anything like how-to’s or other informational pieces (not written by me) that might help others.
OK, two tools thanks to @livingtree and Google: http://t.co/6nZMTcFJBo gives stats, has paid option for even more details (ex custom dates)
—Brian E. Bennett (@bennettscience) October 30, 2013
Blog Posts – These are tweets for any post that I have direct control over.
New post today: A “Radical New Teaching Model” That is Missing the Point http://t.co/Y7MkeBFr6c
—Brian E. Bennett (@bennettscience) October 29, 2013
Other – Goofy articles, mostly. Things shared for nothing but entertainment value.
The 44 Best Pictures Of Scared Bros At A Haunted House Of 2013 http://t.co/GvMsG1Pi3P < totally cheered me up.
—Brian E. Bennett (@bennettscience) October 1, 2013
The Data
I went through all of the interactions and put them into a Google spreadsheet to visualize the data a little bit.
For the mentions, I subtracted my reported (via email) RTs from my mentions count to get a more accurate number; since I gathered those figures through two different means, RTs were probably being double counted.
I went into this expecting that articles and blog posts would draw the highest level of engagement on Twitter. I was really surprised to see that my offhand comments were the most interacted-with.
And I think part of my surprise at the results is that I didn’t set up an easy way (at the beginning) to track the interactions (specifically, mentions) for each individual tweet. All I measured was the number of times a particular tweet was RT’d or favorited, which isn’t really interacting at all…at least not in the traditional sense.
The Deep Stuff
Twitter is a funny machine…it allows people from across the world to interact with one another, but not under the obligation of actually interacting with them. I had no idea how many times particular tweets were retweeted by followers. So, while they were resonating with something I had said or shared, I wasn’t aware of that interaction. So, the question going through my mind right now is:
“Is it my fault for not interacting and engaging a follower, or their fault for not reaching out and engaging with me?”
I decided to create a really official sounding, yet totally made up, metric for my engagement in October. I’m calling it the Engagement Quotient. (Sounds good, right?) I got this by dividing the number of Mentions by my total Tweets for the time period.
To get this, I wanted to know what percentage of my Tweets encouraged some kind of direct response from someone as a Mention.
Out of 689 tweets in October, I received 542 mentions, giving me an EQ score of 79%.
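If you want to play along with your own numbers, the calculation is a one-liner in Python (the figures here are the totals reported above):

```python
def engagement_quotient(mentions, tweets):
    """Made-up 'EQ' metric: percentage of tweets that drew a mention."""
    return round(mentions / tweets * 100)

# 542 mentions across 689 tweets in October
print(engagement_quotient(542, 689))  # → 79
```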
In reality, this means absolutely nothing. But, to me, it means that something I’m doing is engaging my PLN. And that means I’m contributing to the discussion in some way, which makes me feel good about being a connected person/educator/male/whatever.
In the future, I want to have some better goals set up from day one. A lot of this came toward the end of the month as I thought about what it means to be engaged. So, while it is mildly scientific in nature (I had tables and charts), it could have been more so. Maybe I’ll do it again in a few months and see if it changes based on season. I’m not sure.
If you want to see my data, you can check out the spreadsheet here.
Wired Business had an article come out earlier this month about a small school in Mexico that saw huge gains in learning because of some computers put in the classroom. If you haven’t seen it, you can read it here.
Because Wired is wildly popular and because education is kind of a hot topic right now, this article has been making some waves. Even in the office, it made its rounds as a fantastic idea that schools are behind on. The teacher in me welled up and wrote a ~~preachy~~ soapboxy email to the group passing it around, and I figured I would post it here as well.
I’d read this article earlier this week. It’s a fantastic, heartwarming story, but I do want to make one comment (this is the teacher in me talking now)…
Gupta downplays the role of the teacher. On the surface, this looks appealing for a lot of reasons: students can explore on their own without the dictator at the front of the room; they can work collaboratively, problem-solving and self-democratizing. This is all great, but having a teacher is still important (and that’s not just because I’m a teacher at heart). All of this can be done, but the role of the teacher is to provide context for the content. Anyone can get online and look up facts about the moon, DNA, or the French Revolution. What is missing is a facilitator providing context for the flow of information we get from the Internet. There have been plenty of times students were learning the content but then fell flat on their faces when they tried to explain it, because there is no good way to put information into a nice, neat box.
I saw my job in the classroom as being a content resource, yes. More importantly, I could ask the probing questions and listen critically to what students were saying back to me. I’m sure Correa was doing that in his classroom, but it wasn’t brought out in the article. Consider the number of times you get up and talk to someone here who knows more than you do about a task you’re trying to accomplish. You can read StackOverflow and forums all day and still not be able to accomplish the task. It’s the same thing internships are set up to do. We don’t call each other teachers, but we all teach a little bit each day.
I’m not trying to rant against the article…it’s a fantastic story of success against really stacked odds. I’m glad to see stories like this getting press and some time in front of people who might never hear them, but I’m worried that the trend of downplaying the role of educators is going to continue.
Ok, I’m done soapboxing. Thanks for indulging me.
Yes, computers are great at finding information. But, it still takes the wisdom and experience of teachers to put that content into context, and that’s the valuable lesson here.
The eight weeks of remixing have kicked off for CEP811 at MSU. Right off the bat, our first task was to pick an edtech buzzword and use Mozilla’s Popcorn Maker (I think they should have called it “remixerator,” but that’s just me) to make a one-minute video explaining our buzzword.
How could I resist the opportunity to describe a MOOC? (I really just like saying the word “MOOC.” Admit it. You do too.)
Here’s my final remix:
I’d never used Popcorn before and it actually took me a good while to get used to it. I’ve done a lot of video editing, so it was hard for me to not treat it like a full-blown video editor. I actually started this on Tuesday and then had to walk away for a little while to try and clear my head before I finished.
I think what I like the most is being able to search for content as well as link to content right in the media bin. It’s helpful to be able to paste a link and have the photo or GIF pop right in. I also like that links are included in the live project for attribution. It makes the whole curation process much simpler because I don’t have to try and keep track of every source in a separate space. I really tried to come up with metaphors for some of these ideas, but being limited to Creative Commons materials (and I’m not complaining) makes it hard to do sometimes.
I couldn’t help but editorialize a little bit at the end. MOOCs aren’t that hard to understand: they’re super-massive classes taught by a professor (usually using traditional means) to students in an LMS. There is little creativity and little freedom to really use the web. Essentially, they’re not doing a whole lot of innovative work, and now, some are making a ton of money off of their platforms.
Of course, the big conflict is that universities also want to make money off of these online courses. So, already, we have a conflict of interest that doesn’t really do anything to help students have a better online learning experience.
It may be summed up best by Larry Cuban in a recent Washington Post article:
Given the history of universities and colleges in the United States, chances are that many higher education institutions (non-elite and community colleges) will continue to retrofit and transform MOOCs into credit-bearing courses that will yield revenue. MOOCs will not revolutionize higher education.
Are MOOCs here to stay? I’m not putting my money on it.
I drive a really long way to work twice a week.
I love what I do every day, which is why I choose to drive. To entertain myself I listen to a lot of podcasts. Radiolab, WireTap, and This American Life are favorites on the drive.
When I plugged the iPod into the stereo yesterday morning, it jumped to my entire song list at “A,” and just started playing songs in order. Rather than stopping and going right to a podcast, I decided to let it play for a while. I listened to songs Friday morning that I didn’t even realize were in my library (digital music overload, anyone?). It was pretty enjoyable and I heard some great music that I hadn’t listened to since at least high school.
That being said, I switched away after a while because of how disjointed everything felt. Have you ever listened to an album from start to finish without interruption? If you haven’t, you really should. If a band is really thinking about their music, an album has a flow and a continuity that adds to the overall experience of their music. I found myself anticipating the next song on the album only to be disappointed (and sometimes even surprised) by the change of track.
It got me thinking about learning. Your class has a flow…a continuity that helps students travel through the content. Far too often as a teacher I heard, “Why is this important?” or, “When will this even matter?” In other words, I wasn’t doing a great job at helping to mold the entire experience of science around the individual parts. I didn’t have good segues or transitions at times. Other times, I jumped from one topic to another without any prelude, much like playing through your library alphabetically.
Think about how you’re interacting with your students. How do you transition? How are you painting the story of your content? Why should they anticipate the next step or reflect on where they just came from? Think about teaching as a story.
If you’re looking for a fantastic album to listen to start to finish, consider Bon Iver as a place to start.
I made some quick additions this afternoon to my GitHub repo for this project, which I’ve renamed to PySky. I had to do some research on the PyEphem license before I packaged it with my code. It turns out it’s available to distribute freely with other software as long as I provide my source, so now it’s packaged with my simple Python script.
Update 9:08 PM 10/16/13
Looks like I wrote the post too soon. I was able to spend some time tonight working on the script after my wife went to bed and I was able to get both of my original problems solved. The updated script is posted on GitHub.
I was able to find a nifty little piece of code to help me manage responses to prompts so my code is a little cleaner. I don’t have to have as many conditionals (if, elif, else) in my functions anymore, which makes everything look a little bit nicer.
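For the curious, a common pattern for this kind of cleanup is a dictionary that maps each prompt response to a function, so there’s no if/elif/else chain to walk through. A rough sketch of the idea (the handler names here are placeholders, not the actual PySky code):

```python
def show_stars():
    return "star lookup"

def show_planets():
    return "planet lookup"

# Each valid response maps straight to a function.
HANDLERS = {
    "stars": show_stars,
    "planets": show_planets,
}

def handle(choice):
    """Dispatch a prompt response to its handler, tolerating case and whitespace."""
    action = HANDLERS.get(choice.strip().lower())
    if action is None:
        return "unknown choice"
    return action()

print(handle("Stars"))   # → star lookup
print(handle("comets"))  # → unknown choice
```

Adding a new menu option then only means adding one entry to the dictionary instead of another branch.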
As far as my coding, I’m working on a couple things:
- ~~Users will be able to set their location, rather than having it hardcoded to South Bend (which isn’t useful at all).~~
- ~~The program needs to be able to save the location data for all lookups. I think a global variable is a good way to do this, but I need to learn more.~~
Things have been crazy lately, so I haven’t touched this in a while. I’m still also working on getting the hardware I need to get my Raspberry Pi up and running. The only video-out it has is an HDMI port. I don’t have a monitor that can take HDMI, and I’m having a hard time tracking an HDMI-to-VGA adapter down. I might need to turn to Amazon for this one.
BUT, I did get another crucial piece of hardware for this little jaunt that I’m very excited about.
Of course, it happened to come on the cloudiest day of the month so far. I’ll post more pictures of the scope in a later post.
I worked a little bit on my code tonight while Lindsey went to bed early and the baby napped. I added a menu at the beginning of the program that prompts the user to select either Stars or Planets, and then they can enter their choice.
This is a short-term implementation for a few reasons:
- For users without the PyEphem library, I need to see if I can push a download or package the data somehow.
- I don’t want to have a ton of menus to go through.
- The star library is pretty small. It’s good enough now, but I need to find another one to tap into.
- The code is “chunky.” I want to go through and streamline when I’m more awake.
I keep thinking about how much more I’m learning working on this project as opposed to when I tried to learn Python using arbitrary lessons. It’s really driving home the idea of interest-based learning and it’s not something I’m going to forget any time soon.
If you want to see and try the new code, you can fork it on GitHub.
I’ve been dabbling a little bit every day with this project and I’ve made some big changes since day 1.
First, with some help from Brandon Rhodes on StackOverflow, I got the function to print the altitude/azimuth data for a planet when you run the script. This is still hardcoded for South Bend, but that’s where I live, so it makes sense. Down the line, I’ll make this a setting the user can change to their own location.
Next, I found a Python module to pull the current date and time when requesting the planet’s location. Since the Earth is constantly moving, it didn’t make sense to display a position based on the date alone. Now, the script reads that information from the computer and gives more accurate results. Because I did this during the day, I used the sun as my object so I could check its position in my program against other databases and calculators online. And this is where my brain started to hurt.
If you’re not familiar with astronomy (and I’m still learning), you can describe a position in a few different ways. The easiest (most popular?) way is to use altitude and azimuth coordinates. The altitude is the angle of the object above the horizon and the azimuth is its angular distance around the horizon, measured from due north. So, if an object’s position is 30°, 270°, it is 30 degrees above the horizon looking due west.
You can also use celestial coordinates, right ascension and declination. Declination is the angular distance north or south of the celestial equator; in other words, if you stand on the equator and look straight up, you’re looking at Dec = 0. Right ascension, on the other hand, is measured eastward along the celestial equator, like longitude projected onto the sky. To me, this is much harder to conceptualize in my brain, which is why I prefer alt/az descriptors.
So, back to the code. I got it to print alt/az data, which was awesome. So, to make sure it was working correctly, I checked it against some other tools, and that’s when I ran into problems.
So, I went back to the code and changed it to print out the RA/Dec instead of alt/az to see what would happen.
Which was better.
I need to find some way to improve my alt/az calculations. I don’t know if it’s my location data or if the conversions need adjusting, but I’m getting funny answers. For now, I’ll keep it in RA/Dec because the entire point of this program down the line is to pass the data to a telescope, so it doesn’t matter which one is easier for the user to read. We’ll see.
If you want to see the current code, here’s the current dev code base. If you’re a Python coder, feel free to fork and contribute.
This is kind of a long story, but stick with me, because I’m excited about it.
Introduction
Earlier this year, I set a goal for myself to learn Python. I started learning, but I didn’t really have any practical application for what I was doing.
Rising Action
I’ve always enjoyed astronomy, looking at the stars and planets, and more recently, trying to take pictures of them in my back yard.
I’m planning on buying a telescope in the near future, and part of that is going to include a simple motor to make sure I can aim it accurately and efficiently.
I’m also starting a master’s class in a couple weeks at MSU which focuses on maker culture and its implications for the classroom. Rather than purchasing a textbook for the course, we’ve been asked to buy a Raspberry Pi, Makey Makey, Squishy Circuits Kit, or a LittleBits kit.
And that’s when I had my stroke of insight. (I won’t be presumptuous and say genius. Yet.)
My project this fall is going to be going back and actually learning Python to create a program on the Raspberry Pi which can be used to control my telescope.
After doing some searching, I even found a Python library, PyEphem, which is a database of astronomical data put together (in part) by the Jet Propulsion Laboratory. Those are the guys that landed Curiosity on Mars. They know what they’re doing.
I know this is going to be a crazy six weeks of learning, working, and applying that to the classroom. While working full time. And raising my daughter.
Bring it on.
The Conflict
I’ve already started by looking back at the Python I’d already learned to see if I could begin to tap into what I’ve already found. I’ve got a public GitHub repository to hold all of the code as I write. Right now, it’s five lines of code that allow you to pick a planet and a date (even a future date…awesome) and it will tell you which constellation the planet appears in.
If you’ve got experience with Python, I’d really love to have your input as I go through the process.
I’m not expecting to have this totally done by the end of the semester, but I know I’m going to be learning a ton that I’ll be able to take back to the classroom someday.
It was one of those days, and I couldn’t pass this one up.
The Headless ds106 is in full swing, and this week is Design Week. I love design work because it makes me think hard about how to communicate ideas both subtly and artfully. You can see some of my design work from the Twilight Zone theme this past summer.
Earlier today, Rochelle Lockridge posted an article in which someone (a colleague?) created an animated GIF for a presentation they were doing for work. This animation is great (and probably beyond my own GIMP chops) and it was a cool story to see them work through the struggle to learn GIMP to produce the image.
Later, Alan posted this tweet:
This is where the shark jumps #ds106 00 Participant in @Rockylou22 3M Salon applying GIFs to science animation http://t.co/Z5DO2N5k3W
—Alan Levine (@cogdog) October 2, 2013
And BOOM. Sharks? Complicated animated GIFs of technical thingywhosits? I got to work.
I give you: “Make the shark jump you.”
In ds106, we don’t jump the shark. That bad boy jumps us.
Go make art.
Since 2011, Khan Academy’s online library of thousands of educational videos has been heralded as the “savior of education,” a model for flipped classrooms (I’m going to bite my tongue on this one), and now the go-to place for “personalized learning.” There’s a whole lot of bad in here for a lot of different reasons. But the newest piece, the idea of personalized learning delivered by Khan Academy, is dangerous.
How do you recognize an LMS? This is what I came up with:
- An LMS reports data, people reflect.
- An LMS flags poor performance, people grow through engaging members of the community.
- An LMS hosts and organizes learning content, people build their content as they learn.
- An LMS keeps track of grades, people couldn’t care less about grades when they’re engaged.
Remember, an LMS is a machine, nothing more, nothing less. It will only give what you put into it.
Now, back to my question. Khan Academy.
“Personalized learning” has popped up in KA promotional materials lately. (It is the phrase you use in a conference proposal to make sure it’s picked up.) The problem is that the personalized learning offered by most third-party groups isn’t personalized at all. It’s actually a randomized degree of difficulty. In other words, it’s a giant, adaptive test bank that feigns its way into schools under the guise of personalization. Students are still stuck in the system. They are still forced through the steps and procedures. They have no choice in how to demonstrate their learning other than the built-in, old-fashioned assessments. Personalization is being eroded either because companies are really good at sales and marketing, or because we’re all looking for the wrong things.
And this is why Khan Academy is nothing more than a big, fancy LMS. While powerful and extremely helpful, every LMS out there locks you into its system. If you have students, assignments, announcements, documents, and assessments poured into one place, it becomes very difficult to see any reason to step away from that construct. Sure, it makes the teacher’s life easier, but once you’re in, it’s hard to get out (mostly because of time constraints, not necessarily the procedures to switch platforms).
The big difference from “traditional” systems is that the teacher was in control of the content. Not so with Khan Academy, and this is why it’s more dangerous than the others. Teachers and schools are diving into the system because of the helpful data and videos, but at the same time, they’re unwittingly sacrificing any option for a student to choose to do something different.
We’re asking the wrong questions when it comes to evaluating learning tools for our students.
While enticing, we should not be jumping toward anything that has content baked into the system. It becomes too easy to begin relying on that content as the backbone of your course, whether you “mean” to or not. Good intentions don’t count when students’ interests are sacrificed for the sake of simplicity.
What are you really using the LMS for? Too often, an LMS is a one-way communication tool, with students simply uploading materials to turn in for grades. What limitations are placed on them when it comes to choosing their learning opportunities? What options do students have to ask insightful questions and then find resources to report those out? What kind of content can be brought into the system to be used as a resource by anyone else in that system?
Ask yourself when you plan on allowing students to truly direct their own learning. If you can’t come up with a reasonable answer, ask yourself “why not?”
This summer, I survived the ds106zone. There was a creative blitz I’ve been wanting to carry into the fall. But life (literally) has come into my life, and I haven’t been able to keep the momentum going.
Right now, the ds106 community is tracking through a self-inflicted “headless” course. No professor, no grades, no expectations. It’s pretty fun to watch the flow coming through Twitter, and I wanted to take a few minutes today to participate in the visual work they’re doing.
The assignment was a photoblitz: 15 minutes of photography around certain themes. I was able to do about 10 minutes here at my desk during lunch (not counting the writeup). In fact, I didn’t even stand up. It was all done from my chair on my phone, because that’s all I had with me. My five photos are below.
You can see other photoblitz submissions on Flickr.