Visible Geology (app.visiblegeology.com) is an interactive tool for building, modifying and exploring 3D geological structures. Features include adding, removing and adjusting Geologic Beds, Geologic Folds, Faults, Domes & Basins, Dikes, Topography, Cross-Sections, Boreholes, and Strike Decals.
Here in EOAS the tool has been used in several courses. Students in the general science course EOSC110, The Solid Earth: A Dynamic Planet, use it in a homework exercise to build skills needed for an awesome follow-up exercise run with worksheets and small groups in the classroom. That follow-up involves interpreting the large-scale geological map of the state of Wyoming in terms of the geological structures and tectonics of the region.
Getting first-year and non-science students to productively interpret ordinary maps is difficult, let alone geological maps! The Visible Geology homework exercise is the second in a three-part activity sequence. It involves self-directed completion of a worksheet, followed by an online quiz to test the new geological map interpretation skills. The quiz includes seven quantitative and qualitative feedback questions.
The success that students demonstrate at 3D thinking and geology map interpretation in the capstone activity is a testament to the benefits of practicing these expert-like skills using Visible Geology. It also reflects the pedagogic expertise of those who developed this 3-part sequence: Brett Gilley and Lucy Porritt.
Further details, including analyzed results and feedback from the first use of the VG exercise, can be obtained from F. Jones.
As of May 18th, documentation describing the resources built and the tools we used is finally beginning to make sense. These three new pages are accessible from the “Resources” menu above, or as follows:
- Building virtual geo-labs.
- Examples of online activities.
- Resources and tools.
Feel free to comment or request other information.
The “Bloom’s Dichotomous Key” or BDK was developed as part of the EOAS Flexible Learning project in Fall 2014 as a means of judging whether a task or test question causes students to engage in higher- or lower-order cognitive skills. It isn’t about “difficulty” because there can be difficult lower-order (e.g. memory-based) tasks and easy synthesis or creative tasks.
This effort was based on work done by Casagrand and Semsar in the Dep’t of Integrative Physiology at the University of Colorado, Boulder, but we adapted it for use in geoscience and refined it through repeated application by the TLF (Francis Jones) and a teaching assistant (Rhy McMillan).
This link provides a one-page flow chart for applying the key. It is “dichotomous” because the Bloom’s level is arrived at by repeatedly answering yes/no questions about what students are being caused to do. The other two pages provide notes and guidelines plus a simplified flowchart figure. The tool is not officially published, but results have been employed as data for several presentations and workshops, both peer-reviewed and not.
See the three-page PDF here: bdk-geoscience.
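For readers who think in code, the dichotomous structure described above can be sketched as a small decision chain: each node asks a yes/no question, and each answer either assigns a Bloom’s level or leads to the next question. This is a minimal illustrative sketch only; the questions below are hypothetical placeholders, not the actual BDK questions (those are in the bdk-geoscience PDF).

```python
# A dichotomous key as a list of nodes. Each node is a tuple:
#   (question, outcome_if_yes, outcome_if_no)
# where an outcome is either a Bloom's level (string, terminal) or the
# index of the next node to consult. Questions here are placeholders.

def classify(task, key):
    """Walk the key from the first node until a Bloom's level is reached."""
    node = 0
    while True:
        question, on_yes, on_no = key[node]
        outcome = on_yes if question(task) else on_no
        if isinstance(outcome, str):  # terminal: a Bloom's level
            return outcome
        node = outcome                # otherwise, move to the next question

# Hypothetical two-question key for illustration only.
example_key = [
    (lambda t: t.get("recall_only", False), "Remember", 1),
    (lambda t: t.get("requires_synthesis", False), "Create", "Apply"),
]

print(classify({"recall_only": True}, example_key))         # Remember
print(classify({"requires_synthesis": True}, example_key))  # Create
print(classify({}, example_key))                            # Apply
```

The point of the tuple-of-outcomes representation is that adding or reordering questions only changes the data, not the traversal logic, mirroring how the paper flowchart can be revised without changing how it is applied.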
We are excited to have the following paper accepted for publication in Assessment & Evaluation in Higher Education: “Impact Assessment of a Department-wide Science Education Initiative using Students’ Perceptions of Teaching and Learning Experiences”. The Student Learning Experiences Survey, or SLES, is an instrument developed for this work and can be found at http://hdl.handle.net/2429/58046.
Here is the paper’s abstract:
Evaluating major post-secondary education improvement projects involves multiple perspectives, including students’ perceptions of their experiences. In the final year of a seven-year department-wide science education initiative, we asked students in 48 courses to rate the extent to which each of 39 teaching or learning strategies helped them learn in the course. Results were related to the type of improvement model used to enhance courses, class size and course year level. Overall, students perceived unimproved courses as least helpful. Small courses that were improved with support from science education specialists were perceived overall as more helpful than similar courses improved by expert teaching-focused faculty without support, while the opposite was found for medium courses. Overall perceptions about large courses were similar to perceptions of medium courses. Perceived helpfulness of individual strategies was more nuanced and context dependent, and there was no consistent preference for either traditional or newer evidence-based instructional practices. Feedback and homework strategies were most helpful in smaller courses and independently improved courses. Results indicate that students are perceptive to benefits that arise when improvements are made either by expert educators or by research-focused faculty who received dedicated support from science education specialists.