Is there a social issue that you wished got more attention and awareness than it currently does?
Mental health is something I feel needs more attention, understanding, and education. Sure, depression and anxiety have gotten a spotlight in recent years, which has led to a better conversation than in the past, but one could argue that these subjects have been commercialized and sensationalized more than they've been destigmatized. Never mind that there are many other mental illnesses out there, some lesser known than others, that remain heavily stigmatized in our culture and that many people are unaware of or uneducated about. It's heartbreaking how damaging the lack of real discussion is, let alone people not believing these illnesses exist, or even fearing them when they aren't as so-called "attractive" or "acceptable" as the media portrays them to be. For example, while there have been some good depictions of mental illness in fiction, the fact that we still use psychiatric hospitals as horror settings and misrepresent certain mental disorders as "scary" and "othering" is unfortunately really telling. So there need to be better efforts to strip away the misconceptions, misinformation, and overall ignorance and discrimination that are sadly attached to mental health and mental illness in general.
In short: better education on mental health as a whole, because everyone's mental health matters and deserves to be respected and taken seriously.