This hits close to where I live; understanding the reasons for engineering failures is a big part of my job, where 99.9% uptime is considered a horrendous failure. Knowing why things fail, and how the cultures of engineering and management can contribute to the problem or prevent it, is vital. IEEE's Spectrum magazine ran some early articles looking at the solid-fuel rocket booster (SRB) failures and what engineers did to tell their bosses there were serious concerns. Physicist Richard Feynman did a brilliant analysis of how the Byzantine layers of managers and administrators between booster contractor Morton Thiokol and NASA's launch executives obscured risks* that should have been screaming red flags well before that tragic launch.
The thumbnail version is that engineers at Morton Thiokol were well aware there were problems with the SRBs, specifically the performance of the O-ring seals between sections at low temperatures, and were doing their best to communicate it. On the day of the Challenger launch, they made a concerted effort -- and it ran headlong into a layered culture that downplayed risk at every step. ("Those guys always overstate this stuff by a factor of 10; I'll just cut that down to 5x and pass it on to the next level" very quickly sweeps significant concerns under the rug after a few iterations.) Bad data leads to bad decisions; an excessively hierarchical structure makes correcting bad data difficult if not impossible. Throw in "launch fever" to get a high-profile mission underway and... tragedy.
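The way those per-level "adjustments" compound can be shown with a toy calculation. The discount factor and number of layers below are made up for illustration, not taken from the actual Thiokol/NASA reporting chain:

```python
# Toy illustration: an engineer's risk estimate passes through several
# management layers, each of which quietly discounts it "a little".
def filtered_estimate(engineer_estimate, discount_per_level, levels):
    """Return the estimate after each layer cuts it by discount_per_level."""
    estimate = engineer_estimate
    for _ in range(levels):
        estimate *= discount_per_level
    return estimate

# Suppose an engineer flags a 1-in-100 chance of failure, and four
# successive layers each halve the concern ("they always overstate").
original = 1 / 100
after_chain = filtered_estimate(original, 0.5, 4)
print(after_chain)  # 0.000625 -- i.e. 1 in 1600, 16x more optimistic
```

No single layer thinks it is lying; each one shaves the number modestly, and the top of the chain ends up with an estimate an order of magnitude rosier than anything the engineers wrote down.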
In contrast, medical research is relatively "flat": once you get to the level of drug researchers with degrees, they fight like a houseful of teenaged sisters, both within their organizations and then between those companies, labs and universities. Once it reaches human testing, test results are shared -- published -- and analyzed; the testing itself is subject to sharp scrutiny. Bosses don't get to filter risk calculations. And this does not happen because the people, companies, universities and government agencies involved are such lovely, public-spirited people; it happens because they are competitive and suspicious of one another. (Then there's the whole shameful legacy of "human testing," which informs current ethics, law and procedure for such things.) If NASA had six competing contractors making SRBs, and the hope was that most people in the U.S. would get a Space Shuttle trip, do you suppose things might have gone a little differently? If every country with the resources was building and flying Shuttles, and jealously analyzing the ones built by others?
Tl;dr: there's no parallel between the Challenger disaster and the COVID-19 vaccines. And one of the non-parallels is that 60% (and rising) of the U.S. population is a whole lot more people than the grieving survivors of the astronauts killed aboard Challenger by a horrendously lousy system of administration. Piss off enough people and there's nowhere to hide; all NASA and Morton Thiokol had to worry about was being dragged through the press and investigated by Congress. The stakes if they screw up are much higher for Moderna, Pfizer and Johnson & Johnson.
On masks: they are most effective at the source, not the destination. Despite having been vaccinated (and I'm scheduling a booster), I continue to wear masks in shared indoor spaces away from home. I always wear a mask around people who are required to wear a mask around me: the checker at the grocery store has to wear his or her mask the whole shift, while customers breeze through, breathing on 'em all day. It's a very small effort for me to wear a mask for my half-hour or forty-five minutes of shopping, and protect the checker, butcher and stocker from whatever bugs I'm exhaling.
Maybe younger people don't remember this and older ones have forgotten, but before flu shots were widely available, flu season was when people's elderly relations died. COVID-19 is on track to go endemic, just as influenza did, especially after the 1918-20 pandemic. Influenza was not, however, something to shrug off after 1920; it just wasn't overwhelming on a worldwide scale. Bear that in mind; "no worse than a bad cold" for you might still be fatal for the person to whom you pass it along.
* And not just the SRBs. Having found a management-structure problem that gave NASA decision-makers ludicrously low risk estimates for the SRBs, Dr. Feynman looked into the liquid-fueled main engines on the Shuttle and found the exact same thing: engineering estimates of mean time between failures (MTBF) were routinely inflated by the multiple levels of bosses between the slide-rule/pocket-protector engineers who designed and built the engines and the NASA administrators who gave the go-ahead to fly them. Small "adjustments" at every level added up to give unrealistic information to the people who most needed accurate data. This kind of under-the-noise-level wishful thinking is exactly what the clunky structure of FDA evaluation is intended to check for.