Do you think we are better off today than we were 200 years ago? The Pilgrims crossed the pond to flee religious persecution. They were met by the Native Americans, and we all know how that ended. From its inception, the United States has faced trials and tribulations in every decade. Yet, within these moments of egregious acts in our history, I find a glimmer of hope.
Let’s start with a little walk through the twentieth century:
1900s: Child Labor

In the early 1900s, we were still shuffling children of all ages out to work. Since there were no 401(k)s or retirement benefits to rely on, parents would send their children, some as young as 5, to work 10 or more hours a day in horrific conditions. The coal mines and mills were down with this too, because you could pay a child $2 a week as opposed to the $6 to $7 a week an adult would earn. Education was not considered an important factor back then.
However, change did come in the form of activists and reformers, such as those in the Progressive Movement, who worked tirelessly to expose the harsh realities of child labor. Advocates like Lewis Hine, who photographed children working in deplorable conditions, brought national attention to the issue. Writers and journalists published exposés that highlighted the physical and emotional toll on children, shifting public sentiment against child labor. Along with labor union advocacy and legislation, industrial mechanization reduced the demand for small, nimble hands, which were once considered ideal for certain tasks in factories and mines.
The history of child labor reform is a testament to the power of collective action. Advocates, reformers, and policymakers overcame significant opposition to create laws protecting vulnerable populations. If I had known all this when I was 14, I would not have been as upset about being denied working papers.
1910s: Suppression of Free Speech During WWI
In 1917, the United States entered WWI and reinstituted the draft, which had not been in effect since the Civil War. This was met with opposition at home by those who wanted America to remain neutral in the European conflict. Concerned that anti-war rhetoric and public pamphlets might weaken the war effort, President Woodrow Wilson and Congress enacted the Espionage Act of 1917 and the Sedition Act of 1918. These laws made it a crime to use “disloyal, profane, scurrilous, or abusive language” about the U.S. government or military or to deliver speeches intended to “incite insubordination, disloyalty, mutiny, or refusal of duty.” Initially, the laws were intended for those who actually participated in espionage, but their sweeping application led to roughly 2,000 Americans being imprisoned. The Supreme Court accepted broad interpretations of both the Espionage Act and the Sedition Act, citing necessary limits on free speech during times of war.
Eugene Victor Debs was arrested and sentenced to 10 years in prison in the case Debs v. United States, in which the outspoken socialist and presidential candidate had simply pledged support for three men jailed for speaking out against the draft, thereby violating the Espionage and Sedition Acts.
The conviction sparked widespread criticism from those who viewed it as a violation of civil liberties. While in prison, Debs ran for president in 1920, garnering nearly 1 million votes despite his incarceration. Debs was released from prison in 1921 after President Warren G. Harding commuted his sentence, although his conviction was never overturned. Debs became a symbol of the fight for free speech and worker rights.
Although it did not bode well for Debs and the roughly 2,000 other Americans who were imprisoned, it did teach us a few things. Their legacy played a significant role in shaping the modern understanding of civil liberties, spurring advocacy for rights, and refining how governments balance security with individual freedoms.
1920s: Immigration Quotas
Heightened concerns for national security following World War I led to the creation of the Immigration Act of 1924, which also marked the establishment of the first Border Patrol. The act was a reaction to fears that the “traditional” U.S. culture and racial hierarchy would be undermined by immigrants deemed racially and culturally incompatible. The act aimed to preserve the U.S.’s existing racial and ethnic composition by heavily restricting immigration from specific parts of the world while favoring others.
The law established strict quotas based on national origin, calculated using the 1890 Census. This skewed the quotas to favor immigrants from Northern and Western Europe while drastically limiting those from Southern and Eastern Europe, who were seen as “undesirable.” Asian immigrants were completely excluded, extending and codifying the racist principles of the Chinese Exclusion Act of 1882 and barring virtually all immigration from Asia. Ironically, Latin Americans, particularly those from Mexico, Central America, and the Caribbean, were exempt from the restrictive immigration quotas because agriculture relied heavily on this labor.
I found this research very entertaining, and although the types of people we fear allowing into the country have evolved over the years, that fear remains strong. The 1924 Immigration Act marginalized nearly all groups that were not white and of Northern or Western European descent. Its legacy of exclusion and discrimination profoundly shaped the racial and ethnic demographics of the United States for decades.
The silver lining is that the United States now enjoys a rich and diverse culture, thanks to the resilience of marginalized groups who stayed united and preserved their heritage. As someone with Polish ancestry, I recognize that my own roots would have placed me among those excluded during that time. I’ll leave you with this thought: as the world continues to evolve and change, how would you feel if you found yourself in the “not-so-cool” group?
1930s: The Great Depression and the Dust Bowl
After leaving the Roaring Twenties, when times seemed to be at a never-ending high with the stock market soaring and the Charleston all the rage, no one was prepared for what would happen next. Although many point to the collapse of the stock market as the event that drove the United States into such peril, there were many other contributing factors.
Before the Great Depression, the federal government maintained a laissez-faire approach, with minimal regulation of industries, financial markets, or banks. This lack of oversight enabled risky practices, such as speculative stock trading and buying on margin, while the absence of federal deposit insurance left banks vulnerable to collapse. High tariffs, like the Fordney-McCumber Tariff Act of 1922, were designed to protect domestic industries but reduced international trade and strained global economic relationships. There were no federal safety nets, such as unemployment insurance or Social Security, leaving aid for the poor to underfunded state governments and private charities. Pro-business policies, such as reduced corporate taxes and low income taxes for the wealthy, were intended to stimulate growth but instead widened income inequality and reduced consumer purchasing power. Additionally, the government ignored structural problems like agricultural overproduction and falling crop prices, leaving the economy unprepared for the widespread economic collapse of the Great Depression.

The other issue that led to the suffering of the American people occurred during what has been coined the Dust Bowl. The Dust Bowl was a period of severe dust storms in the Great Plains caused by prolonged drought and unsustainable farming practices. Farmers overplowed and removed native grasses, which left soil vulnerable to erosion. Federal policies encouraged over-farming and the rapid settlement of marginal lands in the Great Plains without considering long-term sustainability. Early warnings about the environmental consequences of over-farming were ignored, leading to widespread soil degradation.
In 1933, President Franklin D. Roosevelt addressed the economic hardships of the Great Depression by introducing the New Deal, which focused on relief, recovery, and reform.
Key initiatives included the establishment of the Civilian Conservation Corps (CCC), which provided jobs in environmental conservation; the Federal Deposit Insurance Corporation (FDIC), which insured bank deposits to restore trust in the banking system; and the Social Security Act, which introduced pensions for the elderly and support for the unemployed. These programs significantly expanded the federal government’s role in the economy and aimed to provide both immediate relief and long-term structural reforms.
To combat the effects of the “Dust Bowl,” the government and agricultural experts implemented soil conservation methods such as crop rotation, contour plowing, and planting windbreaks. In 1935, the Soil Conservation Service (SCS) was created to encourage sustainable farming practices. Although recovery was gradual, these innovations helped restore some of the land’s productivity and reduced the risk of similar disasters in the future.
My grandmother used to say that if you have flour and a potato, you will never starve. This mentality was born from the Great Depression, and it is why it is still instilled in me to never waste a drop of food. However hard this time was, America eventually came out the other side.
1940s: Japanese American Internment

The Japanese American internment scared me the most of all. In the aftermath of Pearl Harbor, Franklin D. Roosevelt signed Executive Order 9066 during World War II, forcing 120,000 Japanese Americans, two-thirds of whom were American citizens, into internment camps. These camps were often located in remote, harsh environments and surrounded by barbed wire. Families were forced to abandon their homes, businesses, and belongings, enduring poor living conditions and significant emotional and financial hardship. This egregious act was fueled by fear, racism, and wartime hysteria, as Japanese Americans were falsely accused of being potential threats to national security, despite the lack of any evidence of disloyalty. Most complied peacefully, likely believing it would be a temporary measure, only to find themselves detained for three to four years. While a few challenged the orders in court, the Supreme Court ultimately upheld this injustice.
After the camps closed in 1945 and 1946, many internees returned to devastated communities and had to rebuild their lives from scratch. Some relocated to new areas to escape the stigma and discrimination they faced upon returning to their former neighborhoods. Decades later, the Civil Liberties Act of 1988 offered a formal apology and reparations to surviving internees, but the scars of this injustice endured for generations.
The internment of Japanese Americans during World War II was a profound injustice to their civil liberties. I vividly remember George Takei, best known from Star Trek, sharing his family’s harrowing experience when he was just five years old. Despite this, he continues to advocate passionately for civil rights, embodying resilience and hope. My father, who served in the Navy during WWII, visited Japan after the bombing of Hiroshima, and I understand the fear of the time but not the response. Yet, through immense hardship, the Japanese American community displayed incredible strength and grace, emerging from this dark chapter with dignity and determination, a testament to the power of resilience and hope.
1950s: McCarthyism
After WWII there was a growing fear of communism which led to the Cold War. McCarthyism was a period of intense fear and suspicion in the United States during the early years of the Cold War, marked by widespread accusations of communism. It was named after Senator Joseph McCarthy, who spearheaded an anti-communist crusade that targeted government officials, entertainers, academics, and ordinary citizens. This era highlighted the dangers of fear-driven politics and the suppression of civil liberties.
The House Un-American Activities Committee was created to investigate suspected communists, pressuring individuals to name others and further spreading fear and paranoia. This led to baseless accusations, blacklisting, and persecution of those who refused to cooperate with the committee. Public trust in government institutions was undermined by the abuse of power. This all sounds too familiar.
However, courageous figures like Edward R. Murrow, a prominent journalist, publicly challenged McCarthy’s tactics. Murrow’s program, See It Now, exposed the senator’s lack of evidence and reliance on fearmongering, shifting public opinion against him. In 1954, McCarthy’s reckless accusations against the U.S. Army during televised hearings led to his downfall. The hearings revealed his bullying tactics and lack of credibility. Later that year, the Senate formally censured McCarthy, marking an end to his influence and discrediting the extreme anti-communist movement.
Although McCarthyism caused significant harm to individuals and undermined civil liberties at its peak, the courage of those who resisted helped show the importance of fairness, evidence, and due process. This highlights how, even today, freedom of speech and a free press remain powerful tools in challenging injustice.
1960s: Civil Rights Violations
The 1960s are when I started this journey called life. I still have a hard time believing this happened in my lifetime, and that it unfortunately continues at some level today. As you read what life was like for a Black American during this time, I implore you to read it as if you were the one who had to live in this injustice. Imagine the pain, the fear, and the indignity of enduring such systemic oppression. Feel the weight of it as if it were your own, and let that empathy guide your understanding of this painful chapter in our history with the hope of ending it through better understanding.
Life for Black Americans in the 1950s was defined by systemic racism, constant fear, and profound inequality. Under the oppressive grip of Jim Crow laws, segregation was a daily humiliation, with separate and vastly unequal schools, restrooms, restaurants, and public spaces. Black Americans were forced to enter through “colored” entrances, sit in the back of buses, and address white people with deference while receiving no respect in return. The fear of lynchings and racial violence loomed large, as Black men, women, and children could be brutally murdered for the smallest perceived offense, often with no consequences for the perpetrators. Economic opportunities were scarce, with Black individuals relegated to menial, low-paying jobs, while housing discrimination confined families to impoverished neighborhoods. Voting was systematically denied through poll taxes, literacy tests, and intimidation, leaving Black Americans without political representation or the power to change their circumstances. Beyond these tangible injustices, the constant dehumanization, through police brutality, false accusations, and daily interactions steeped in racism, created a relentless weight of oppression.
Even in the face of such overwhelming adversity, Black communities demonstrated incredible resilience. Churches, community organizations, and historically Black colleges became sources of strength and leadership, nurturing a rich cultural identity and fostering the seeds of the Civil Rights Movement. Despite the horrors of segregation and racial violence, Black Americans continued to fight for dignity, justice, and equality, laying the foundation for the monumental changes that were to come the following decade.
After the Civil War and the abolition of slavery in 1865, there was hope of racial equality. That hope went unfulfilled, and for decades leading up to the 1960s Civil Rights Movement, organizations like the NAACP (founded in 1909) fought against lynching, segregation, and disenfranchisement, but change was slow. The activism of the 1960s was built on decades of groundwork laid by earlier movements. The combination of heightened awareness, the civil rights legal framework, mass communication, international pressure, and a new generation of activists created the perfect storm for large-scale, sustained action. While systemic racism had persisted for centuries, the social, political, and cultural changes of the mid-20th century made the 1960s a pivotal moment in the fight for equality.
During WWII, millions of Black Americans fought for democracy overseas only to be denied those same freedoms at home. The fight against fascism planted the seed for activism at home. There were significant legal victories as well:
- Brown v. Board of Education (1954): The Supreme Court declared segregation in public schools unconstitutional, inspiring further challenges to institutionalized racism.
- Executive Order 8802 (1941): Issued by President Franklin D. Roosevelt, it banned discrimination in defense industries, marking the first federal action against workplace segregation.
Television and mass media were widely accessible, bringing the reality of racial injustice into American living rooms. Images of police brutality, such as the attack on peaceful protesters during the Selma to Montgomery marches, shocked the public and built widespread support for the movement. Leaders like Martin Luther King Jr. used the media to amplify their message and mobilize support nationally and internationally.
The victories of the Civil Rights Movement were monumental, reshaping the legal and cultural landscape of the United States. However, the struggle for racial equality did not end with the 1960s. While the Civil Rights Act of 1964 and the Voting Rights Act of 1965 dismantled many discriminatory practices, systemic racism persisted in new forms. The movement remains a powerful reminder of the importance of activism, resilience, and the fight for justice in overcoming societal evils.
1970s: Watergate Scandal
The Watergate Scandal was a defining moment in American history, exposing corruption at the highest levels of government and shaking public trust in the presidency. The scandal began with a break-in at the Democratic National Committee headquarters at the Watergate complex in Washington, D.C., in 1972, orchestrated by operatives linked to President Richard Nixon’s re-election campaign. It quickly unraveled into a massive abuse of power, as the Nixon administration engaged in illegal activities, including wiretapping, political espionage, and a cover-up to obstruct justice.

The break-in was part of a broader strategy by Nixon’s team to sabotage political opponents, reflecting the administration’s obsession with winning at all costs. A cover-up ensued, with Nixon and his aides using government resources to suppress investigations, bribe witnesses, and mislead the public. Nixon himself participated in discussions about obstructing justice, which were later revealed through secret White House tape recordings. The scandal highlighted deep flaws in campaign finance practices and a willingness to exploit power for personal and political gain.
Several forces brought the scandal to light and forced accountability:
- Investigative Journalism: Reporters Bob Woodward and Carl Bernstein of The Washington Post played a pivotal role in uncovering the scandal, with the help of an anonymous source known as “Deep Throat” (later revealed to be FBI Associate Director Mark Felt). Their reporting brought critical details to light and kept public attention on the issue.
- Congressional Hearings: Bipartisan efforts in Congress, particularly during the Senate Watergate Committee hearings, exposed the breadth of corruption and abuse within the Nixon administration. The hearings were broadcast on national television, captivating the country and further eroding trust in Nixon.
- Judicial Action: The Supreme Court’s unanimous decision in United States v. Nixon (1974) compelled Nixon to release the White House tapes, which contained damning evidence of his involvement in the cover-up.
- Public Pressure: The American public’s growing outrage and demand for accountability forced Nixon to face impeachment proceedings. On August 8, 1974, Nixon resigned from the presidency, becoming the first U.S. president to do so.
The Watergate Scandal profoundly impacted American politics and governance. In its wake, significant reforms were enacted to restore transparency and public trust, including:
- Campaign Finance Reform: The Federal Election Campaign Act Amendments (1974) placed stricter limits on contributions and increased transparency in campaign funding.
- Government Oversight: The scandal reinforced the importance of checks and balances, strengthening mechanisms to prevent future abuses of executive power.
- Freedom of the Press: Watergate underscored the critical role of investigative journalism in holding power accountable and protecting democracy.
While the Watergate Scandal revealed alarming corruption and abuse of power, it also demonstrated the strength of American institutions. Investigative journalism, bipartisan cooperation, and public vigilance brought about justice and reforms, reinforcing the importance of accountability and transparency in government. This chapter in history serves as both a cautionary tale and a testament to the resilience of democratic principles.
1980s: The AIDS Crisis

I was a new nurse at the start of the AIDS crisis. I remember working in the hospital with patients who were ignored due to fear of a new disease that no one understood, or cared about for that matter. AIDS was initially labeled a “gay disease” (referred to as GRID, Gay-Related Immune Deficiency) because it disproportionately affected gay men in the early years. This fueled homophobia and slowed efforts to address the crisis. People living with AIDS were often ostracized by their families, fired from jobs, and denied housing or medical care.
The Reagan administration was slow to respond to the crisis. President Reagan did not publicly mention AIDS until 1985, years after the first cases were reported in 1981. Funding for AIDS research and treatment was initially minimal, despite the rapid spread of the disease and rising death toll. Public health agencies like the CDC struggled with inadequate resources to study, track, and combat the epidemic.
Public health campaigns about HIV/AIDS transmission were scarce in the early years, allowing myths and misinformation to flourish. Many people believed HIV could be transmitted through casual contact, leading to irrational fear and discrimination.
By the late 1980s, AIDS had claimed the lives of hundreds of thousands of people worldwide, devastating entire communities. The emotional toll on caregivers, partners, and families was immense, compounded by the social isolation and shame many experienced.
Activists staged protests and public die-ins, forcing AIDS into public consciousness and pressuring policymakers to act. The crisis spurred significant medical breakthroughs in the understanding and treatment of HIV/AIDS. The development of antiretroviral therapies (ART) in the mid-1990s transformed HIV/AIDS from a death sentence into a manageable chronic condition for many people. Public health campaigns eventually educated the public about HIV transmission and prevention, reducing fear and misinformation.
The AIDS crisis galvanized the LGBTQ+ community, fostering unity and activism that extended beyond the epidemic to broader fights for equality and rights. It helped challenge societal norms and increase visibility for LGBTQ+ individuals. The crisis eventually led to greater compassion and understanding for those living with HIV/AIDS. Celebrity figures like Magic Johnson and activism by allies helped normalize discussions around HIV and reduce stigma.
I have worked with many patients affected by this disease, and as a nurse, I witnessed its devastating beginnings firsthand. It’s remarkable how much progress has been made in a relatively short time, which is a testament to the tenacity of a community that refused to be marginalized or forgotten.
1990s: “Don’t Ask, Don’t Tell”
The “Don’t Ask, Don’t Tell” (DADT) policy, enacted in 1993, was a significant moment in LGBTQ+ history, reflecting both progress and the enduring discrimination faced by queer individuals. The policy, introduced by the Clinton administration as a compromise between LGBTQ+ advocacy and military opposition, allowed gay, lesbian, and bisexual individuals to serve in the U.S. military, but only if they kept their sexual orientation hidden. This policy was rooted in prejudice and created a climate of fear and secrecy, ultimately harming countless service members.
Under DADT, LGBTQ+ service members were forced to hide their sexual orientation or risk immediate discharge. While commanders were prohibited from directly asking about someone’s orientation, witch hunts and investigations often took place based on rumors or suspicions. This created a climate of constant fear, leading to significant psychological and emotional tolls for LGBTQ+ individuals, who endured feelings of shame, isolation, and unworthiness as they suppressed their identities to continue serving. Over 13,000 service members were discharged under the policy during its 17 years, losing their livelihoods and military benefits, while the military itself suffered from the loss of skilled personnel.
The repeal of DADT stands as a powerful testament to the resilience of advocacy and the evolving journey toward equality. Organizations like the Servicemembers Legal Defense Network (SLDN) and Human Rights Campaign (HRC) tirelessly championed the cause, providing legal support to those unjustly discharged and shining a light on the emotional and professional harm caused by the policy. High-profile cases and personal testimonials from service members further amplified the call for change. Over time, public opinion shifted, with growing recognition that LGBTQ+ individuals deserved the right to serve openly. By the 2000s, surveys revealed that most Americans, including many within the military, viewed the policy as outdated and discriminatory. The election of President Barack Obama marked a turning point, as he prioritized repealing DADT, emphasizing its incompatibility with military values and effectiveness. In December 2010, Congress passed the repeal legislation, and on September 20, 2011, LGBTQ+ service members could finally serve openly, marking a historic moment of progress. The repeal not only symbolized a victory for LGBTQ+ rights but also underscored the power of advocacy and the promise of a more inclusive future.
As we are only a quarter of the way into the twenty-first century, time will tell what other egregious acts we will live through as Americans. We started this century with the Patriot Act, reports of police brutality, systemic racism, climate change denial, and an insurrection. As I was doing this research, I found strands of familiarity in the past that can be applied to today. A common theme in all of this is not hate but fear. People fearing people is such a strange but true concept. If we were all stripped of our skin, nationality, or gender, would we still fear each other?
As you look around your world today and approach it with trepidation or elation, one thing we can all count on is that life is fluid and does not stand still. Everything is temporary, and whether we like it or not, it will change. We have seen that, out of all the dangerous mistakes made through the years, great changes ultimately emerged.
The fight for equality and civil liberties has to continue until racism against all cultures can be extinguished. A fundamental promise of this nation remains unfulfilled. I have hope that we will continue to fight to get it right. The power of positivity carries an energy that far surpasses negativity, and through hope, unity, and determination, one day we will have liberty and justice for all.
Will the history books judge harshly those who stood by and allowed fear, hate, and divisiveness to define our time? Great legacies have fallen with the passage of time, once the dust settled and the repercussions were revealed. Only time will reveal how the actions, or inactions, of today will shape our legacy and how we are remembered by future generations.
Patricia A. Woods, PAW Talks: Hope for the Future