TOPS Course Feedback (1/19/2024) #620
-
Thanks for the comments @AdamER123, and we always appreciate hearing the takeaways that researchers have when taking the Open Science 101 curriculum, as well as their experiences with open science at large. I do like your suggestion about linking to definitions of certain words that may not be understandable to all. I would definitely encourage you to open an issue or pull request referencing the specific parts of the curriculum where you could see improvements being made. There is information on these mechanisms here. There will be some updates to this content for the curriculum specifically, but it can still help you get started with contributing to this repository. The reviewing team may accept some or all of these suggestions. As for the bug mentioned on the curriculum itself, I would reach out to the curriculum development team at arc-openscience101@mail.nasa.gov to report any errors you encounter.
-
Dear TOPS team,
I was introduced to this course at the American Astronomical Society (AAS) 243 conference. I take notes as I go through courses, and I wanted to share some feedback on the current version. Per the FAQs and private suggestions, I made a GitHub account to share it. Already a good impact! For context, I am a 6th-year PhD student at the University of Rochester. No need to reply; I just thought the feedback would be good to give.
General comments:
Word clouds are sometimes used in activities, and I find it difficult to read the smaller words. An alternative could be a link to a dictionary, or a histogram (bar chart) showing a few of the most, least, and moderately popular words.
Errors sometimes pop up about content not being able to be generated. I am unsure why, but the exact error is: "An embedded page at openscience101.org says An error has occurred: Error - unable to acquire LMS API, content may not play properly and results may not be recorded." It is probably some sort of driver that is not loaded (and not an urgent fix; I could still activate the modules if I refreshed, so it is more of an inconvenience). Relatedly, some activities did not work for me (e.g. module 4, approximately activity 6, about SMPs), so that might be connected.
Notes from module 1: In lesson 1, the citizen science or community science link (https://education.nationalgeographic.org/resource/citizen-science/) was taken down. In such cases it might be better to use archive.org (Wayback Machine) pages, self-hosted copies, or perhaps screenshots (or alternative sources). I did notice the White House Office of Science and Technology Policy page was recent, for example, and hopefully that one will be well maintained (which I appreciate).
In lesson 2, the diversity flag itself is fine, but I wonder if it will become dated at some point (e.g. versus the previous rainbow flag for LGBT+ backgrounds). I also sometimes wonder whether it reaches international audiences, though I understand NASA's primary focus is likely on US interests. (I am not international myself, so I want to emphasize that I cannot speak to that; I just know that many students in my program are, and they are sometimes confused by things like this, both linguistically and culturally.) But I know the team cares about and is aware of these matters, especially going into later modules (e.g. open results).
Module 2 (ethos): I learned about continuous integration with GitHub Actions, describing software dependencies (e.g. for Python, this could be done with pip), and how long-term storage can be done with Zenodo, but not necessarily with GitHub.
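As a note to self, those ideas can be sketched in a minimal GitHub Actions workflow. This is just my own illustration, not from the course; the Python version and the pytest command are placeholder assumptions about a hypothetical project:

```yaml
# .github/workflows/ci.yml (hypothetical example workflow)
name: CI
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      # Dependencies are declared in requirements.txt and installed with pip
      - run: pip install -r requirements.txt
      # Run the project's test suite (assuming pytest here)
      - run: pytest
```

Continuous integration like this runs the tests on every push, which pairs nicely with declaring dependencies explicitly so others can reproduce the environment.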
Module 4 (open code): Again, reinforced the matters about GitHub and Zenodo, and how they have different functions.
I wonder about one of the quiz questions (apologies, I did not note the lesson number). I saw one point listing data privacy concerns as NOT a reason to share data. But there also seems to be a lot of subtlety to this point. Personally, I agree with the quiz answer, but I have encountered some professors who believe the reverse (e.g. regarding the importance of proprietary periods). Speaking honestly, I can understand both sides, though I am always trying to learn more.
It was also very helpful to see the layout of a README! Noting to self, it should contain:
- the name of the project or project portion being addressed (as well as the author of the files and acknowledgement of team members);
- one sentence on what the software is;
- one sentence in plain language that does not assume reader knowledge.
Extra additions were:
- a list of any code dependencies;
- how to install and run the software (e.g. an order of actions);
- a detailed description of the software if no external documentation exists (including any common bugs);
- an example procedure showing how to use the software.
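That layout could be sketched as a minimal README skeleton. This is my own sketch, not the course's template; the project name, author, and commands are placeholders:

```markdown
# ExampleProject (hypothetical name)

Author: Jane Doe, with acknowledgement of the Example team.

One sentence describing what the software is.
One plain-language sentence that assumes no prior reader knowledge.

## Dependencies
- Python 3.x, numpy (placeholder examples)

## Installation and usage
1. `pip install -r requirements.txt`
2. `python run_analysis.py`

## Description and known bugs
A detailed description goes here if no external documentation exists,
along with any common bugs and an example procedure for using the software.
```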
Finally: I actually write many READMEs and have maintained a data drive for my research group for the past year, and I was always confused about README principles. I did, however, learn some principles about organization from my library: https://libguides.lib.rochester.edu/dmsplans/organizing (and I very much appreciate that this course promotes these kinds of resources and guidelines!).
Module 5: I noticed mentions of blog posts and Twitter/tweeting to promote citations. This is entirely anecdotal, but I wonder if this is highly field dependent. I noticed the main citation is to G. Eysenbach, about medical research in 2011, which is when Twitter was especially relevant. I think nowadays (again, anecdotally) we see that online interactions can also lead to some downsides regarding misinformation (not a reason not to share). But this leads me to believe the main benefit of these modalities is engaging more audiences and the public (e.g. aligning with NASA's Universe of Learning/public engagement, NSF's broader impacts, etc.) as opposed to citations. One could perhaps add Instagram or TikTok as well, but I know media will always change (I am already out of date with respect to it!).
Sincerely,
Adam