# Fourth Amendment
defensive-tactics · 3 months ago
Someone finally got it right...
odinsblog · 1 year ago
An internet and privacy watchdog has a warning: Your car is tracking you, and it’s collecting far more information than it needs just to get you where you’re going.
Mozilla, the nonprofit that develops the Firefox browser, released a report Wednesday detailing how the policies of more than two dozen car manufacturers allow for the collection, storage and sale of a wide range of sensitive information about auto owners.
Researchers behind the report said that cars now routinely collect data on par with tech companies, offer few details on how that data is stored and used, and don’t give drivers any meaningful way to opt out.
“Cars are a humongous privacy nightmare that nobody’s seemingly paying attention to,” said Jen Caltrider, who directs Privacy Not Included, a consumer privacy guide run by Mozilla. “And they’re getting away with it. It really needs to change because it’s only going to get worse as cars get more and more connected.”
Unlike Europe, the U.S. has few meaningful regulations on how companies trade and store personal data. That’s led to a bustling industry of companies that buy and sell people’s information, often without their knowledge.
Carmakers have a long list of personal information they say they may track, including employment and purchasing history, education, internet browsing history, location data, music and podcast listening habits, immigration status, religious and philosophical beliefs and health information.
(continue reading)
eternalistic · 2 years ago
“Correctional facilities across the country have a variety of rationales they use to justify this, but largely it boils down to the fact that scanned electronic mail is easier to surveil than physical mail. NYC’s plan is ostensibly in response to a spike in overdoses in NYC’s jail system.”
“Council members also had privacy concerns should mail be recorded off-site by a private contractor. [...] Such data can be retained far into the future and be used against people even if they have never been charged with a crime, have been released from jail, or have had charges dismissed...” “Previous attempts to digitize mail have resulted in first amendment lawsuits.”
dontmeantobepoliticalbut · 1 year ago
Republican state legislators in North Carolina are establishing a new investigative body that Democratic critics have aptly compared to a “secret police force.”
This new entity, formally known as the Joint Legislative Committee on Government Operations, or “Gov Ops” for short, will be chaired by Senate Leader Phil Berger (R) and House Speaker Tim Moore (R). It grants the state the authority to investigate various matters, including “possible instances of misfeasance, malfeasance, nonfeasance, mismanagement, waste, abuse, or illegal conduct.”
Gov Ops, a product of North Carolina’s most recent state budget, was established via a comprehensive bill passed in late September. Despite Democratic Gov. Roy Cooper’s refusal to sign the legislation, the Republican majority in the state legislature pushed it through just 10 days later, thanks to their veto-proof majority and the state’s laws restricting the governor’s ability to make line-item vetoes. Gov Ops is slated to take effect next week.
Any way you slice it, Gov Ops seems like a recipe for government overreach and abuse. If you find yourself under investigation by Gov Ops, you won’t be allowed to publicly discuss any alleged constitutional violations or misconduct by the investigators. All communications with committee personnel would be treated as “confidential.” Shockingly, you’d also be denied the right to seek legal counsel regarding your rights if Gov Ops were to search your property without a warrant, irrespective of whether it’s in a public or private space.
Nora Benavidez, a senior counsel with the nonprofit advocacy group Free Press, told The Daily Beast, “This is a question for the courts ultimately. But the powers granted to the Gov Ops appear to give them overreaching investigative authority, which invokes constitutionality questions.”
A critical aspect of Gov Ops’ development lies in the language of the statute itself. The key phrase, as highlighted by Republican state legislators, is the investigation of “possible instances of misfeasance.”
It’s unsettling that North Carolina’s Republican state legislators are poised to wield unchecked partisan authority, devoid of any form of accountability, to determine what qualifies as “possible instances of misfeasance.” This newfound investigative power threatens to have far-reaching repercussions on fundamental civil liberties, particularly those closely intertwined with the state legislature—such as voting rights and abortion.
Consider the 2020 election aftermath. Following the election’s conclusion, several North Carolina Republican lawmakers—mirroring Trump and other far-right figures nationwide—demanded access to voting machines, relying on dubious sources and unfounded claims of voter fraud.
Initially, North Carolina Republicans asserted that they would work with police to obtain warrants for such inspections. However, with the advent of Gov Ops, committee leaders could now allege “possible instances of misfeasance,” eliminating the need for a warrant and keeping the public in the dark.
With the 2024 election looming, Republicans in the state legislature will redraw voting maps after the new conservative majority on the state’s Supreme Court legalized partisan gerrymandering. (The Princeton Gerrymandering Project called North Carolina one of the most gerrymandered states in the country.)
The redistricting process in the state has been grueling; since 2011, six different versions of maps have been drawn. The process has been conducted mainly behind closed doors, and North Carolinians continue to express frustration over how they’ve been locked out of the process.
A provision of Gov Ops will likely permit lawmakers drawing the maps to bypass public records requests: “lawmakers responding to public records requests will have no obligation to share any drafts or materials that guided their redistricting decisions.”
Now, let’s look at abortion. During a legislative hearing, state Sen. Graig Meyer (D) asked lawmakers, in a hypothetical scenario, if Gov Ops could access personal health records (like ultrasounds) that are required by the state to receive abortion pills. Sen. Meyer found that Gov Ops, with its widespread ability to investigate with zero oversight, could release information like this “to the public in a hearing” if it wanted to.
Benavidez explained, “At the end of the day, Gov Ops actions and requests for information are all protected as confidential, adding a layer of opacity which means people in North Carolina will have largely no idea what the Gov Ops entity is really doing.”
The consolidation of power by Republicans in North Carolina through Gov Ops is not just a cause for concern; it is a stark warning sign. The ability of state legislators to wield unchecked authority—shielded from the scrutiny of the voters they are obliged to serve—strikes at the heart of democratic principles.
Transparency and accountability are not optional in a democracy; they are its lifeblood.
When the process of drawing voting maps becomes cloaked in secrecy, when mechanisms to hold our elected officials accountable are dismantled, we risk losing our most cherished rights to our legislators, who should be our staunchest defenders.
Government powers like Gov Ops can erode the very foundations of our democracy, which cannot work if politicians refuse to serve the people or to accept any accountability.
thatautisticlesbian · 1 year ago
isn't it silly that police don't protect you from other people. Other people protect you from police.
ausetkmt · 4 months ago
The first time Karl Ricanek was stopped by police for “driving while Black” was in the summer of 1995. He was twenty-five and had just qualified as an engineer and started work at the US Department of Defense’s Naval Undersea Warfare Center in Newport, Rhode Island, a wealthy town known for its spectacular cliff walks and millionaires’ mansions. That summer, he had bought his first nice car—a two-year-old dark green Infiniti J30T that cost him roughly $30,000 (US).
One evening, on his way back to the place he rented in First Beach, a police car pulled him over. Karl was polite, distant, knowing not to seem combative or aggressive. He knew, too, to keep his hands in visible places and what could happen if he didn’t. It was something he’d been trained to do from a young age.
The cop asked Karl his name, which he told him, even though he didn’t have to. He was well aware that if he wanted to get out of this thing, he had to cooperate. He felt at that moment he had been stripped of any rights, but he knew this was what he—and thousands of others like him—had to live with. This is a nice car, the cop told Karl. How do you afford a fancy car like this?
What do you mean? Karl thought furiously. None of your business how I afford this car. Instead, he said, “Well, I’m an engineer. I work over at the research centre. I bought the car with my wages.”
That wasn’t the last time Karl was pulled over by a cop. In fact, it wasn’t even the last time in Newport. And when friends and colleagues shrugged, telling him that getting stopped and being asked some questions didn’t sound like a big deal, he let it lie. But they had never been stopped simply for “driving while white”; they hadn’t been subjected to the humiliation of being questioned as law-abiding adults, purely based on their visual identity; they didn’t have to justify their presence and their choices to strangers and be afraid for their lives if they resisted.
Karl had never broken the law. He’d worked as hard as anybody else, doing all the things that bright young people were supposed to do in America. So why, he thought, can’t I just be left alone?
Karl grew up with four older siblings in Deanwood, a primarily Black neighbourhood in the northeastern corner of Washington, DC, with a white German father and a Black mother. When he left Washington, DC, at eighteen for college, he had a scholarship to study at North Carolina A&T State University, which graduates the largest numbers of Black engineers in the US. It was where Karl learned to address problems with technical solutions, rather than social ones. He taught himself to emphasize his academic credentials and underplay his background so he would be taken more seriously amongst peers.
After working in Newport, Karl went into academia, at the University of North Carolina, Wilmington. In particular, he was interested in teaching computers to identify faces even better than humans do. His goal seemed simple: first, unpick how humans see faces, and then teach computers how to do it more efficiently.
When he started out back in the ’80s and ’90s, Karl was developing AI technology to help the US Navy’s submarine fleet navigate autonomously. At the time, computer vision was a slow-moving field, in which machines were merely taught to recognize objects rather than people’s identities. The technology was nascent—and pretty terrible. The algorithms he designed were trying to get the machine to say: that’s a bottle, these are glasses, this is a table, these are humans. Each year, they made incremental, single-digit improvements in precision.
Then, a new type of AI known as deep learning emerged—the same discipline that allowed miscreants to generate sexually deviant deepfakes of Helen Mort and Noelle Martin, and the model that underpins ChatGPT. The cutting-edge technology was helped along by an embarrassment of data riches—in this case, millions of photos uploaded to the web that could be used to train new image recognition algorithms.
Deep learning catapulted the small gains Karl was seeing into real progress. All of a sudden, what used to be a 1 percent improvement was now 10 percent each year. It meant software could now be used not just to classify objects but to recognize unique faces.
When Karl first started working on the problem of facial recognition, it wasn’t supposed to be used live on protesters or pedestrians or ordinary people. It was supposed to be a photo analysis tool. From its inception in the ’90s, researchers knew there were biases and inaccuracies in how the algorithms worked. But they hadn’t quite figured out why.
The biometrics community viewed the problems as academic—an interesting computer-vision challenge affecting a prototype still in its infancy. They broadly agreed that the technology wasn’t ready for prime-time use, and they had no plans to profit from it.
As the technology steadily improved, Karl began to develop experimental AI analytics models to spot physical signs of illnesses like cardiovascular disease, Alzheimer’s, or Parkinson’s from a person’s face. For instance, a common symptom of Parkinson’s is frozen or stiff facial expressions, brought on by changes in the face’s muscles. AI technology could be used to analyse these micro muscular changes and detect the onset of disease early. He told me he imagined inventing a mirror that you could look at each morning that would tell you (or notify a trusted person) if you were developing symptoms of degenerative neurological disease. He founded a for-profit company, Lapetus Solutions, which predicted life expectancy through facial analytics, for the insurance market.
His systems were used by law enforcement to identify trafficked children and notorious criminal gangsters such as Whitey Bulger. He even looked into identifying faces of those who had changed genders, by testing his systems on videos of transsexual people undergoing hormonal transitions, an extremely controversial use of the technology. He became fixated on the mysteries locked up in the human face, regardless of any harms or negative consequences.
In the US, it was 9/11 that, quite literally overnight, ramped up the administration’s urgent need for surveillance technologies like face recognition, supercharging investment in and development of these systems. The issue was no longer merely academic, and within a few years, the US government had built vast databases containing the faces and other biometric data of millions of Iraqis, Afghans, and US tourists from around the world. They invested heavily in commercializing biometric research like Karl’s; he received military funding to improve facial recognition algorithms, working on systems to recognize obscured and masked faces, young faces, and faces as they aged. American domestic law enforcement adapted counterterrorism technology, including facial recognition, to police street crime, gang violence, and even civil rights protests.
It became harder for Karl to ignore what AI facial analytics was now being developed for. Yet, during those years, he resisted critique of the social impacts of the powerful technology he was helping create. He rarely sat on ethics or standards boards at his university, because he thought they were bureaucratic and time consuming. He described critics of facial recognition as “social justice warriors” who didn’t have practical experience of building this technology themselves. As far as he was concerned, he was creating tools to help save children and find terrorists, and everything else was just noise.
But it wasn’t that straightforward. Technology companies, both large and small, had access to far more face data and had a commercial imperative to push forward facial recognition. Corporate giants such as Meta and Chinese-owned TikTok, and start-ups like New York–based Clearview AI and Russia’s NTech Labs, own even larger databases of faces than many governments do—and certainly more than researchers like Karl do. And they’re all driven by the same incentive: making money.
These private actors soon uprooted systems from academic institutions like Karl’s and started selling immature facial recognition solutions to law enforcement, intelligence agencies, governments, and private entities around the world. In January 2020, the New York Times published a story about how Clearview AI had taken billions of photos from the web, including sites like LinkedIn and Instagram, to build powerful facial recognition capabilities bought by several police forces around the world.
The technology was being unleashed from Argentina to Alabama with a life of its own, blowing wild like gleeful dandelion seeds taking root at will. In Uganda, Hong Kong, and India, it has been used to stifle political opposition and civil protest. In the US, it was used to track Black Lives Matter protests and Capitol rioters during the uprising in January 2021, and in London to monitor revellers at the annual Afro-Caribbean carnival in Notting Hill.
And it’s not just a law enforcement tool: facial recognition is being used to catch pickpockets and petty thieves. It is deployed at the famous Gordon’s Wine Bar in London, scanning for known troublemakers. It’s even been used to identify dead Russian soldiers in Ukraine. The question whether it was ready for prime-time use has taken on an urgency as it impacts the lives of billions around the world.
Karl knew the technology was not ready for widespread rollout in this way. Indeed, in 2018, Joy Buolamwini, Timnit Gebru, and Deborah Raji—three Black female researchers at Microsoft—had published a study, alongside collaborators, comparing the accuracy of face recognition systems built by IBM, Face++, and Microsoft. They found the error rates for light-skinned men hovered at less than 1 percent, while that figure touched 35 percent for darker-skinned women. Karl knew that New Jersey resident Nijer Parks spent ten days in jail in 2019 and paid several thousand dollars to defend himself against accusations of shoplifting and assault of a police officer in Woodbridge, New Jersey.
The thirty-three-year-old Black man had been misidentified by a facial recognition system used by the Woodbridge police. The case was dismissed a year later for lack of evidence, and Parks later sued the police for violation of his civil rights.
A year after that, Robert Julian-Borchak Williams, a Detroit resident and father of two, was arrested for a shoplifting crime he did not commit, due to another faulty facial recognition match. The arrest took place in his front garden, in front of his family.
Facial recognition technology also led to the incorrect identification of American-born Amara Majeed as a terrorist involved in Sri Lanka’s Easter Day bombings in 2019. Majeed, a college student at the time, said the misidentification caused her and her family humiliation and pain after her relatives in Sri Lanka saw her face, unexpectedly, amongst a line-up of the accused terrorists on the evening news.
As his worlds started to collide, Karl was forced to reckon with the implications of AI-enabled surveillance—and to question his own role in it, acknowledging it could curtail the freedoms of individuals and communities going about their normal lives. “I think I used to believe that I create technology,” he told me, “and other smart people deal with policy issues. Now I have to ponder and think much deeper about what it is that I’m doing.”
And what he had thought of as technical glitches, such as algorithms working much better on Caucasian and male faces while struggling to correctly identify darker skin tones and female faces, he came to see as much more than that.
“It’s a complicated feeling. As an engineer, as a scientist, I want to build technology to do good,” he told me. “But as a human being and as a Black man, I know people are going to use technology inappropriately. I know my technology might be used against me in some manner or fashion.”
In my decade of covering the technology industry, Karl was one of the only computer scientists to ever express their moral doubts out loud to me. Through him, I glimpsed the fraught relationship that engineers can have with their own creations and the ethical ambiguities they grapple with when their personal and professional instincts collide.
He was also one of the few technologists who comprehended the implicit threats of facial recognition, particularly in policing, in a visceral way.
“The problem that we have is not the algorithms but the humans,” he insisted. When you hear about facial recognition in law enforcement going terribly wrong, it’s because of human errors, he said, referring to the over-policing of African American males and other minorities and the use of unprovoked violence by police officers against Black people like Philando Castile, George Floyd, and Breonna Taylor.
He knew the technology was rife with false positives and that humans suffered from confirmation bias. So if a police officer believed someone to be guilty of a crime and the AI system confirmed it, they were likely to target innocents. “And if that person is Black, who cares?” he said.
He admitted to worrying that the inevitable false matches would result in unnecessary gun violence. He was afraid that these problems would compound the social malaise of racial or other types of profiling. Together, humans and AI could end up creating a policing system far more malignant than the one citizens have today.
“It’s the same problem that came out of the Jim Crow era of the ’60s; it was supposed to be separate but equal, which it never was; it was just separate . . . fundamentally, people don’t treat everybody the same. People make laws, and people use algorithms. At the end of the day, the computer doesn’t care.”
Excerpted from Code Dependent: Living in the Shadow of AI by Madhumita Murgia. Published by Henry Holt and Company. Copyright © 2024 by Madhumita Murgia. All rights reserved.
thoughtportal · 2 years ago
over the ensuing decades, judges granted more and more powers to the police to stop and search vehicles. In particular, they were given the authority to do so on the mere pretext of suspecting criminal activity – in what is now known as a pretextual traffic stop. But what constitutes a “reasonable” pretext is still a legal gray area. The fourth amendment is supposed to protect us against searches and seizures that are “unreasonable”. The problem is that when fourth amendment cases are brought against police, courts and juries routinely defer to the officer’s testimony.
This judicial tilt in favor of discretionary authority inevitably led to abridgments of civil liberties, and worse.
jbfly46 · 1 year ago
Using the Threads app gives it access to all of your data, including sensitive medical information, which is a violation of HIPAA and your Fourth Amendment rights.
thats-my-pinwheel · 1 year ago
"The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized."
These words used to mean something, and that meaning used to matter.
hell world!!!! hell world!!!! hell world!!!!
socialjusticefail · 1 day ago
The city is using these cameras for crime prevention.
liesmyteachertoldme · 29 days ago
The recent reissuance of Department of Defense (DoD) Directive 5240.01, approved at the highest levels of the Pentagon and signed into effect by the Secretary of Defense, represents a significant challenge to the core Constitutional protections that we hold dear. Here’s how it threatens these freedoms:
Violation of the Posse Comitatus Act: This Act limits the powers of the federal government in using military personnel for domestic law enforcement. The new DoD directive, by permitting the use of lethal force through military assistance in civilian law enforcement, effectively overrides these limitations. 
Erosion of the First Amendment: Natural health advocates, and others exercising their First Amendment rights, such as questioning the government's response to COVID-19 or the integrity of elections, are now being labeled as potential domestic extremists. This directive expands those classifications into lethal force interventions, potentially silencing voices under the guise of national security.
Fourth Amendment Infringement: This directive also allows intelligence sharing between military and law enforcement under emergency conditions, raising concerns about the right to privacy and unlawful surveillance.
Due Process Violations (Fifth Amendment): The possibility of military use of lethal force in domestic scenarios introduces concerns about bypassing due process protections before potentially life-ending decisions are made.
Were it not for being personally targeted by this very administration during the COVID-19 era, I might not even be aware of this directive or feel compelled to report on a topic that seems out of range for natural health advocacy or health freedom. Yet, this strikes at the heart of what it means to be both healthy and free: for the government to have the authority to use lethal force in emergency situations without adequate transparency or debate is something that should not go unnoticed.
As of today, I can find zero alternative or mainstream media coverage on this directive, and no official government announcement that I could find.
Why is that?
We need to ask these questions. The timing of this change, right before the election and while the South is under incredible stress and pain from these storms (which many believe may have been artificially energized and directed), makes this stealth move by the DoD even more concerning.
Given the extenuating circumstances, I feel it is my duty to warn you all. Please take the time to read the full article and share it with others.
Let’s do our best to stay strong, stay informed, and support one another. Remember, self-care and connection with loved ones is vital in these trying times.
Together, we will get through this, stronger and more resilient than ever.
Warm regards,  Sayer Ji Founder, GreenMedInfo.com
nationallawreview · 3 months ago
FCC’s New Notice of Inquiry – Is This Big Brother’s Origin Story?
The FCC’s recent Notice of Proposed Rulemaking and Notice of Inquiry was released on August 8, 2024. While the proposed rule is, deservedly, getting the most press, it’s important to pay attention to the Notice of Inquiry. The part that concerns me is the FCC’s interest in the “development and availability of technologies on either the device or network level that can: 1) detect incoming…
erebusvincent · 3 months ago
The Fourth Amendment still applies at the border, despite the feds' insistence that it doesn't. 
For years, courts have ruled that the government has the right to conduct routine, warrantless searches for contraband at the border. Customs and Border Protection (CBP) has taken advantage of that loophole in the Fourth Amendment's protection against unreasonable searches and seizures to force travelers to hand over data from their phones and laptops.
But on Wednesday, Judge Nina Morrison in the Eastern District of New York ruled that cellphone searches are a "nonroutine" search, more akin to a strip search than scanning a suitcase or passing a traveler through a metal detector.
Although the interests of stopping contraband are "undoubtedly served when the government searches the luggage or pockets of a person crossing the border carrying objects that can only be introduced to this country by being physically moved across its borders, the extent to which those interests are served when the government searches data stored on a person's cell phone is far less clear," the judge declared.
dontmeantobepoliticalbut · 1 year ago
As the national security workforce ages, dementia impacting U.S. officials poses a threat to national security, according to a first-of-its-kind study by a Pentagon-funded think tank. The report, released this spring, came as several prominent U.S. officials trusted with some of the nation’s most highly classified intelligence experienced public lapses, stoking calls for resignations and debate about Washington’s aging leadership.
Sen. Mitch McConnell, R-Ky., who had a second freezing episode last month, enjoys the most privileged access to classified information of anyone in Congress as a member of the so-called Gang of Eight congressional leadership. Ninety-year-old Sen. Dianne Feinstein, D-Calif., whose decline has seen her confused about how to vote and experiencing memory lapses — forgetting conversations and not recalling a monthslong absence — was for years a member of the Gang of Eight and remains a member of the Senate Intelligence Committee, on which she has served since 2001.
The study, published by the RAND Corporation’s National Security Research Division in April, identifies individuals with both current and former access to classified material who develop dementia as threats to national security, citing the possibility that they may unwittingly disclose government secrets.
“Individuals who hold or held a security clearance and handled classified material could become a security threat if they develop dementia and unwittingly share government secrets,” the study says.
As the study notes, there does not appear to be any other publicly available research into dementia, an umbrella term for the loss of cognitive functioning, despite the fact that Americans are living longer than ever before and that the researchers were able to identify several cases in which senior intelligence officials died of Alzheimer’s disease, a progressive brain disorder and the most common cause of dementia.
“As people live longer and retire later, challenges associated with cognitive impairment in the workplace will need to be addressed,” the report says. “Our limited research suggests this concern is an emerging security blind spot.”
Most holders of security clearances, a ballooning class of officials and other bureaucrats with access to secret government information, are subject to rigorous and invasive vetting procedures. Applying for a clearance can mean hourslong polygraph tests; character interviews with old teachers, friends, and neighbors; and ongoing automated monitoring of their bank accounts and other personal information. As one senior Pentagon official who oversees such a program told me of people who enter the intelligence bureaucracy, “You basically give up your Fourth Amendment rights.”
Yet, as the authors of the RAND report note, there does not appear to be any vetting for age-related cognitive decline. In fact, the director of national intelligence’s directive on continuous evaluation contains no mention of age or cognitive decline.
While the study doesn’t mention any U.S. officials by name, its timing comes amid a simmering debate about gerontocracy: rule by the elderly. Following McConnell’s first freezing episode, in July, Google searches for the term “gerontocracy” spiked.
“The President called to check on me,” McConnell said when asked about the first episode. “I told him I got sandbagged,” he quipped, referring to President Joe Biden’s trip-and-fall incident during a June graduation ceremony at the U.S. Air Force Academy in Colorado, which sparked conservative criticisms about the 80-year-old’s own functioning.
While likely an attempt by McConnell at deflecting from his lapse, Biden’s age has emerged as a clear concern to voters, including Democrats. 69% of Democrats say Biden is “too old to effectively serve” another term, an Associated Press-NORC poll found last month. The findings were echoed by a CNN poll released last week that found that 67% of Democrats said the party should nominate someone else, with 49% directly mentioning Biden’s age as their biggest concern.
As commander in chief, the president is the nation’s ultimate classification authority, with the extraordinary power to classify and declassify information broadly. No other American has such privileged access to classified information as the president.
The U.S.’s current leadership is not only the oldest in history, but also the number of older people in Congress has grown dramatically in recent years. In 1981, only 4% of Congress was over the age of 70. By 2022, that number had spiked to 23%.
In 2017, Vox reported that a pharmacist had filled Alzheimer’s prescriptions for multiple members of Congress. With little incentive for an elected official to disclose such an illness, it is difficult to know just how pervasive the problem is. Feinstein’s retinue of staffers have for years sought to conceal her decline, having established a system to prevent her from walking the halls of Congress alone and risk having an unsupervised interaction with a reporter.
Despite the public controversy, there’s little indication that any officials will resign — or choose not to seek reelection.
After years of speculation about her retirement, 83-year-old Speaker Emerita Rep. Nancy Pelosi, D-Calif., stunned observers when she announced on Friday that she would run for reelection, seeking her 19th term.
petnews2day · 5 months ago
A Missouri Police Officer Shot a Blind and Deaf Dog. Now He's Being Sued.
A man has filed a lawsuit against the town of Sturgeon, Missouri, a little more than a week after a police officer shot and killed his small, blind, and deaf dog. In a federal lawsuit filed in the U.S. District Court for the Western District of Missouri, Nicholas Hunter alleges that Officer Myron Woodson and […]
See full article at https://petn.ws/RiD71 #DogNews #Dogs #FourthAmendment #Missouri #Pets #Puppycide