Friday, June 30, 2017

I hate being the curmudgeon in the room, but...

I was preparing for the summer term, looking up info on our LMS's support site, and a demo on how to use Pinterest and Flickr for curation was being pushed on the top menu with a flashing link. I clicked on it and my initial reaction was "What the hell? I spend hours copying and pasting questions, importing questions into Question Banks, and trying to determine which questions are in which bookmarked question bank that I can't edit, all while trying to make sure my students know what they are supposed to be learning. And they spend time on this?" Needless to say, I was pretty indignant; this felt like building climate-control features for the people in the front of the car while the rear passengers don't have seat belts. The LMS was paying attention to bells and whistles (as all edtech does) and not getting to the simple stuff (as all edtech does).

I went off a little half-cocked, not realizing that this demo was being put on by an individual through the LMS's site. So yes, I have a little egg on my face, and I troubled another faculty member who, if they're anything like me, doesn't need additional stressors in their life. I apologized and, hopefully, cleared things up enough to resolve any perceived antagonism. The point remains: basic functionality for curating questions in the LMS is clunky, non-intuitive, and time-consuming. Moodle has had this functionality for at least a decade. This new LMS has to build everything from the ground up with the funds it has allocated for development. It chooses to focus on making vocal groups happy with bells and whistles, but neglects core functionality of an LMS.

There was a suggestion of using an outside tool, and I can use our online homework system for everything Canvas does (and have), but because other faculty use Canvas, and to keep the student experience consistent, I need to use Canvas. Would I like my students to be adaptable 21st-century students who can use different online tools? Absolutely. Are some of my community college students food insecure, able to access the internet only from campus or a library? Yes. I have to choose my battles, as I live in a system larger than myself.

The person putting on the demo then suggested I use an outside installation of Moodle as a 'LEARNING' tool... What? I need a consistent place to have students answer questions, reflect on their answers, and record their attempt for a grade. I need these activities to be part of a grade or students won't do them. Yeah, I'd love for students to do what I ask of them purely in the interest of learning, but the maxim "Students will only do something if it impacts their grade" is one I am burdened with and won't be able to change anytime soon. Sure, maybe I'm feeding into the learning-as-point-scoring approach to grades, but again, I live in a system larger than myself. If you want to be the magical unicorn who can move mountains and lead your students to the promised land, great. I can't right now.

This person later wrote a blog post and I'd like to respond to a few points.

  • My complaint is more than just "not getting what I want"; it is that the LMS is being irresponsible with its time and money by focusing on bells and whistles that I am sure will help some vocal faculty, while not focusing on basic core functionality that almost all LMSs have. My STEM folks usually aren't vocal about their needs, and won't spend time in the LMS's social media experience voting on ideas. While questions and question banks may not rank high in the voting, we can't do our job without them. 
  • Get out of my face with this Teachers Throwing Out Grades (#TTOG) stuff. One day, sure, let's talk about it, but right now I can't even curate questions in my LMS. You want to throw out a foundational assumption of higher education (awarding grades based on understanding and performance) without providing a strong argument for it, just assuming everyone is on board? Stop. I am not joining your cult. (Yet.)
  • Domain of One's Own is so broad and generic that I don't even get what you are talking about. You haven't defined it, you just assume everyone is on board, and after some research it sounds like a great tool for upper-level undergraduate or graduate courses, not the courses I teach. Your tools are not my tools. 

I know I am coming off as an imperious curmudgeon, and this is not the way I want to be with the larger edtech community, other faculty, and students. I would rather be a positive, uplifting, supportive, and challenging voice that helps to build consensus, foster debate, develop relationships, and find ways of working together. I really would. But when people in edtech pull a 'paradigm shift' while many of us work in institutions with strong compliance cultures, teach students with a multitude of needs, and lack necessary tools in our LMS, it feels like the privileged asking the unprivileged to meet expectations we have no ability to reach.

Friday, September 9, 2016

NEA Higher Education Advocate is actually pretty useful!

I am sure many of you get the National Education Association's little 'magazine' Higher Education Advocate. It is usually filled with either the higher education equivalent of fluff pieces on the local news, or legal/political issues that affect higher education. However, if you pull your September 2016 issue (Vol. 34, No. 4) out of the round filing cabinet, in the Thriving in Academe feature you will find an article by James M. Lang titled Small Teaching: Lessons for Faculty from the Science of Learning. Lessons for me? Based on actual science? Get out!

Really though, I feel like education today is similar to alchemy in its final stages: lore, superstition, and patterns that were never rigorously tested, with a new challenger approaching in the form of the Enlightenment and the scientific method. Much of what I have read is based on educators' experience, on what has and hasn't worked for them in the past. This is great and wonderful, and I eat it all up, but shouldn't there also be an empirical way of looking at education informed by cognitive science? I know cognition and epistemology are not everyone's cup of tea, but it seems that we have to get into them a bit to know which best practices are actually 'best'.

Lang breaks his recommendations into four parts of the class session: right before class, the opening of class, the "long middle" of the class, and the ending of the class. He recommends instilling some kind of wonder or awe in the "right before class" slot, which I can see working well in mathematics classes if done correctly: possibly a news-related result, or a classic fable like Zeno's Paradox or doubling rice grains on a chessboard. In the opening of the class he recommends asking students what they did in the previous class, which in my mind activates prior knowledge and builds connections to that day's material. I do this in my daily Quizzes, but he seems to make it a bit more conversational. I like that, but I'm unsure if I can squeeze it into my current setup.
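(For anyone who hasn't run the chessboard numbers, here is the arithmetic that makes it a hook; these are my own back-of-the-envelope figures, not something from Lang's article. One grain on the first square, doubled on each of the 64 squares, gives 1 + 2 + 4 + ... + 2^63 = 2^64 − 1, roughly 1.8 × 10^19 grains in total, and the last square alone holds about 9.2 × 10^18 grains, more than all the previous squares combined. That last fact is usually where the awe kicks in.)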

During the middle of the class he recommends some notebook thing; I don't know, it didn't seem all that useful. What made an impact was his suggestion for the end of class: a one-minute 'paper' answering one of two questions, "What question remains in your mind after today's class?" or "What was the most important thing you learned today?" These reminded me of the 'Stickiest Point' question one of my tenure advisers suggested I use, and I have incorporated them into my post-class quizzes in our learning management system. The first question makes good use of a student's recall ability and points to areas an instructor could address next class, but it is also broad enough that a student could ask how the ideas of that class connect to past or future classes. The second question also has students practice recall, but asks them to summarize the content in their own words or possibly describe something new they learned about themselves.

Granted, all of these questions have a fatal flaw: a student could answer any of them with two to four words. In my post-class quizzes I award credit based on completion, so these students earn credit for poor answers. I do push the idea that with these types of reflection questions you only get out what you put in; by taking the time to reflect on the question and your answer, you give yourself another opportunity to think and learn about the material.

Coincidentally (or not) this article's title is also the title of his book, which will go on my long cobweb-collecting reading list.

What do you think? Did you read the article? Are you going to read his book? What are small things you are trying this term?


Thursday, September 8, 2016

Pre-Fall Term Psychic Exorcism: Statistics class ideas on a page

The past two weeks between summer term and the faculty work week have been spent packing (we bought a house!), cooking a lot of good food, watching Star Trek: TNG, and reading a variety of books and articles meant to 'help' my teaching. I'm not sure if they are helping right now; I just have a lot of ideas floating around in my head that I need to put somewhere, namely here.

  • I've been browsing Technology-Supported Mathematics Learning Environments, 67th Yearbook (2005), and while it is focused on a K-12 audience, I have taken a few ideas from it:
    • Teaching Strategies for Developing Judicious Technology Use by Ball and Stacey helped address my concerns about letting students run amok with calculators ('mathematical totems', as I call them in class). They suggest, as is a common theme with many education best practices, that we have to model how to use technology tools, and not just their actual use but whether to use them at all. I am hoping to incorporate some of the strategies below into my in-class activities through question prompts, discussions, or demonstrations.

    • Comparing Distributions and Growing Samples by Hand and with a Computer Tool by Bakker and Frederickson focused on middle school students and their conceptual development of data, samples, population, and measures of center. This passage in particular struck me:
      • We can compare this situation (focus on calculation of measures of center) to the proverbial tip of the iceberg. It is the substance beneath the surface that makes the iceberg float. In this metaphor, mean, median, and mode are the visible tip of the iceberg. What is beneath the visible surface is the knowledge and skills that students really need to understand and sensibly use these measures of center. 
      While it is necessary to be able to do the computations, it is much more critical for students to develop an understanding of when these measures are appropriate, and then to apply and use these numbers in context. This reinforces the need for students to write about these numbers in context and base decisions on them. My in-class activities should always contain summary questions that ask students to write in words what their calculations are and what they mean. (I sketch the kind of example I have in mind after this list.)

  • Student Engagement Techniques: A Handbook for College Faculty by Barkley offers a wide variety of ways to get students engaged with course material. One section I found especially useful was the two-page spread on "Try to rebuild the confidence of discouraged and disengaged students." Teaching statistics usually means teaching a student population that has some mathematical knowledge but is not confident in that knowledge. Below is a list of strategies based on Motivating Students to Learn by Brophy (2004).

    I tried having students set goals in my summer Calculus course, asking them to describe their study plan for the weekend. I think this helped with planning and understanding consequences, and overall helped students take responsibility. I will definitely incorporate these questions into post-quizzes for my statistics course.

    The entirety of Chapter 8: Tips and Strategies for Promoting Active Learning should be tattooed on my body somewhere. I know and apply a number of the strategies (activate prior learning, clarify your role, limit and chunk information, etc.), but I found the section on "Teach in ways that promote effective transfer" useful. I regularly refer to Bloom's Taxonomy in my classes, but the table below really hit home that your strategies and methods for developing those cognitive tasks should be different for each level of understanding.
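Coming back to the measures-of-center point above, here is the kind of quick, made-up example I have in mind for those summary questions (my numbers, not Bakker and Frederickson's): five households report annual incomes of $30k, $35k, $40k, $45k, and $300k. The mean is $450k / 5 = $90k, while the median is $40k. The summary question then asks which number better describes a 'typical' household and why; the single high income drags the mean far above what most of the households actually earn, so the median is the more honest summary here. Having students write that last sentence in their own words is the whole point.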
So in summary, for my Fall Statistics course I will:
  • Include judicious technology use questions throughout in-class activities. 
  • Have questions that ask students to interpret their calculations, and make decisions based on them. 
  • In announcements and post-quizzes, write comments or questions that have students think about what they have accomplished, set realistic goals, go to different support services, and reflect on their own cognitive processes. 
  • During the start of the course, and during exam review sessions, share the Learning Strategies table and have students determine what a question is asking, at what level, and possibly how they should study for such a question. 
Glad I got those ideas out; I can now refer back to them after the term to keep myself accountable. What ideas are floating around in your head for the upcoming term? Any exciting projects, new ways of doing things, assignments, assessments? Feel free to share below. 

Wednesday, July 27, 2016

Starting the Guided Pathways conversation at my institution

I applied to and was accepted into the Clark College Summer Guided Pathways Institute. It is a four-day workshop that looks to start the conversation on Guided Pathways (GP) at the college. There have been a number of readings discussing the data supporting the use of GPs, the different models in use, and how Choice Architecture can be applied to helping undergraduates choose majors and/or programs.

Overall I have been really impressed with the thoughtful and smart people in the institute. The conversations have always been positive and constructive, and have shared perspectives I would not normally encounter in my day-to-day. The readings have been helpful for seeing what other institutions are doing, and we've begun to think about which parts of these programs we want to try.

Below is a version of a discussion forum post I made to the course site. Feel free to add your thoughts or comments below.

I finished Implementing Guided Pathways at Miami Dade College: A Case Study and had a few thoughts.
  1. The recommendation "Integrate academic programs and student support services." seemed on-point for our campus. The few times I have reached out to support services the results have always been positive, and we achieved more than I could as an individual instructor. I get the sense that faculty occasionally feel like the world is on our shoulders, when we can (and should) share the load with student support service staff. In most cases they may be better trained and equipped to help in certain situations and with specific student populations. I would love to see a way to integrate these two pillars of the college through Canvas, CTC Link, or some other medium.
  2. The recommendation "Increase student engagement through communities of interest." is very appealing, and would strengthen a number of goals in the Academic Plan. These communities could be students from the same meta-major, and supported through a 'wrap-around' class. As the terms progress students could be exposed to other classes they may want to take, student clubs they may find interesting, student government positions that are open, career services for their industry, and the talks and seminars we regularly put on. (The STEM Seminar Series is awesome btw.) I could also envision these communities of interest organized around specific themes, or the big intractable problems of the day. A Global Warming Group could contain students from biology, government, engineering, and a variety of other meta-majors to talk about the causes, 'controversy', solutions, market applications, and the other facets of this problem.
  3. The "Getting Faculty Buy-In at the Front End" issue, in my view, is one of the thornier questions of this entire endeavor. What does the arc of developing these pathways look like when some faculty don't even recognize the problem?
  4. "Because of the initial positive results from the restructured intake process and the added revenue generated by the improved retention, the college's leadership approved the hiring of 25 new full-time advisors." (emphasis mine) This was not something I thought very much about until reading this article; increased retention rates would help provide for the funding of the continued development of pathways. It may also increase our ability to try new initiatives within pathways as they develop over time.
  5. "Overall, the largest threat to institutional redesign at MDC was organizational inertia. Communicating frequently about progress, building consensus, and creating a sense of urgency were vital to creating a sense of shared ownership and to generating momentum for change across the college." Once we leave this institute, what group/committee/mechanism will there be to communicate progress on developing pathways? How are expectations for progress going to be set? Can the 2016 Fall Term Faculty Workdays be structured in a way to move this forward?

Friday, February 19, 2016

Lots of little changes to classes this week.

I've been making little tweaks to my classes based on student performance and responses. All of these decisions are being made on qualitative data, not quantitative data, which isn't a bad thing, but it is something I want to move away from. I would like to develop some metrics over spring break to put into place next term that would help me make these decisions based on data. The percentage of available homework in the online system (WAMAP) that is complete, the number of quizzes scored zero, and other metrics would help in making data-driven decisions.
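To make that concrete, here is a rough sketch of the kind of calculation I mean, written in Python. The gradebook layout is entirely hypothetical; WAMAP and our LMS each have their own export formats, so the file name and column names below are placeholders I made up, not their actual fields.

import csv

def load_rows(path):
    # Read a gradebook-style CSV export into a list of dictionaries.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def homework_completion(rows):
    # Percent of available homework completed, per student.
    pct = {}
    for row in rows:
        completed = float(row["hw_completed"])   # hypothetical column name
        available = float(row["hw_available"])   # hypothetical column name
        pct[row["student"]] = 100 * completed / available if available else 0.0
    return pct

def zero_quiz_count(rows, quiz_columns):
    # Number of quizzes scored zero (missed or blank), per student.
    zeros = {}
    for row in rows:
        zeros[row["student"]] = sum(1 for c in quiz_columns if float(row[c] or 0) == 0)
    return zeros

rows = load_rows("gradebook_export.csv")         # hypothetical file name
hw = homework_completion(rows)
zq = zero_quiz_count(rows, ["quiz1", "quiz2", "quiz3"])
for student in hw:
    # Flag students I might want to check in with before the next exam;
    # the 60% and two-zero-quiz cutoffs are arbitrary starting points.
    if hw[student] < 60 or zq[student] >= 2:
        print(f"{student}: {hw[student]:.0f}% homework, {zq[student]} zero quizzes")

Even something this crude would give me a consistent rule for who gets a nudge, instead of relying on my impressions week to week.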

My tweaks this week:

  • In Calculus I students were to complete the homework on the related rates section on Tuesday. Most did not. This was not completely surprising: the topic is a physical application, and the setup of each question can take a while. Additionally, this homework took quite a bit more time than others, so even if they budgeted for it, they may not have budgeted appropriately. I gave them an additional six days, to Monday at 11:59 PM, the day of our next exam. 
  • Also in Calculus I we are currently talking about graphing functions using information about the first and second derivatives. This is a difficult section because it includes conceptual knowledge about these derivatives and quite a bit of computation. (I sketch the flavor of one of these graphing problems after this list.) For today's quiz in the morning class I divided students into two groups; one group would work on graphing one function, the other group another function. For the first five minutes students were to work on it themselves. For the next five minutes students were to work with a partner. For the last five minutes students were to work with all the students who had the same function. Each group would have one person present the question. After trying it out, only one group presented, and I finished the other question. To let them use the quiz as a study aid, I allowed them to take it home, but to get credit they would have to email me a hand-drawn graph of the function I presented. 
  • In College Algebra I did not have a pre-made quiz to start the day, so I had students take out a sheet of paper, write one of the questions we have been talking about, give it to another student, and have them solve it. Overall it was a fun activity, albeit a little broccoli-covered-in-cheese. At the end I talked about how much I like them writing these questions.
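For non-math readers, here is the flavor of that graphing work, using my own illustrative function rather than either of the ones on the quiz: take f(x) = x^3 − 3x. Then f'(x) = 3x^2 − 3 = 3(x − 1)(x + 1), so the function increases up to x = −1, decreases between x = −1 and x = 1, and increases again afterward, giving a local maximum at (−1, 2) and a local minimum at (1, −2). Meanwhile f''(x) = 6x, so the graph is concave down for x < 0 and concave up for x > 0, with an inflection point at the origin. Each piece is a short computation; assembling all of them into one coherent graph is where students struggle.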
What change in course structure, grading, or presentation did you make on the fly that worked well? That didn't work so well? Feel free to share below. 

Tuesday, February 16, 2016

Same course, different terms, completely different classes.

Last term I taught MATH151 Calculus I in the morning daily, and it felt so RIGHT. The pacing of the class, my in-class examples, questions from students, the schedule, the end-of-week activities that have students explore different topics: everything felt like the best it could ever be. This term, for whatever reason, things are not going so well. I'm teaching two sections of the class, and they feel wildly different from each other.

The morning class seems tired, not really 'there', and swings between general bewilderment and complete boredom at what we're doing. Test scores are low, and there are still (WEEK 7!) students who haven't registered for the online homework system. I've even started moving back to lecturing two days a week since participation in the in-class examples has been low. There are a number of students who think of mathematics in very linear terms, which limits their ability to solve application questions, yet at the same time their work is disorganized. Other students are unprepared for most of the algebra in the course, and I fear they are not going to pass for this reason. In this class I feel like a taskmaster.

The afternoon class is energetic, but has a habit of going off the rails at the slightest provocation. I have to do a lot of sheep-dogging (making sure the group is together) as we go through each question. In-class examples are better received with this class, and they work well in groups, but questions that require a long, sustained method are difficult. Numeric outcomes for this class are generally positive, but I wonder if they are getting the conceptual understanding down. In this class I feel like a positive guide to the discipline.

I hope this doesn't come across as complaining about my students; it just seems that a class reflects both the instructor and the students, as it is a culture both groups are building together. I'm coming to recognize that each class has to be different because it contains different people. I may have 'empirical' ('imperial'?) methods and assessments, but if they don't somehow reflect the students in the course, am I being as effective as I could be?

Tuesday, February 9, 2016

Experienced Faculty = Font of practical knowledge and hard truths.

I recently had an observation by an experienced faculty member and they gave me some great advice that I thought I would pass along.

  • Trying to use random whiteboard markers that don't work looks bad. Every college classroom has an assortment of whiteboard markers in the tray that people have left; some work, and some don't. When an instructor tries to make a point but their marker doesn't work, it brings up thoughts of the absent-minded professor who isn't prepared for class. While this is a minor issue, it is one that helps set the tone of the class. Solution? Bring your own supply of whiteboard markers, with some kind of tape or rubber band to mark them. 
  • Every instructor apparently has some kind of verbal tic, some phrase or series of words that they use as a crutch to fill the empty space between actual words. Mine? "Right?" I have heard that I use "Right?" before, but after forty times this faculty member stopped counting. I think I get into a 'flow' and don't really think about my word choice sometimes. Since being told this I am trying to be very conscious of the words that come out of my mouth, but sometimes I just slip back into that flow. The observer did ask, "If it's not harmful to students, is it really something you need to worry about?", to which my answer is no. At the same time, I don't like the idea that my language is not controllable and that sometimes I just say stuff. 
  • I pack a lot of material into my courses, and some of it could be done by students ahead of time. In this particular lesson I was having students graph rational functions using their calculators. We were then looking at the patterns in the graphs and how they relate to the factors of the numerator and denominator, looking for the specific patterns we discuss with this subject. The observer mentioned that students can do a lot of this work ahead of time. (I give a concrete example of the kind of patterns involved after this list.) 
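To give a concrete sense of those patterns, here is a made-up example of my own, not the one from that lesson: take f(x) = (x − 1)(x + 2) / ((x − 1)(x − 3)). The shared factor (x − 1) produces a hole at x = 1 rather than an asymptote, the remaining factor (x − 3) in the denominator gives a vertical asymptote at x = 3, the factor (x + 2) gives an x-intercept at x = −2, and because the numerator and denominator have the same degree, the graph levels off at the horizontal asymptote y = 1. These are exactly the kinds of conjectures students could generate and test from calculator graphs before class.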
What advice have you received from an observation? Did you incorporate feedback into your teaching?