#IBM study
Explore tagged Tumblr posts
otherworldlyinfo · 1 year ago
Text
A Groundbreaking Study Conducted by IBM Tells Workers to Embrace AI
The Comprehensive Study by IBM: Illuminating the AI Landscape
A Vision Beyond Apprehension: Augmenting Roles with AI
Strategic Reskilling and AI Integration: A Recipe for Success
Evolving Skills: From Technical Proficiency to Interpersonal Acumen
Seizing the Future: Leveraging AI for Unprecedented Growth
The rapid advancements in artificial intelligence (AI) have ignited both excitement and…
Tumblr media
0 notes
retrocgads · 1 year ago
Text
Tumblr media
USA 1997
91 notes · View notes
gadgetshowtech · 1 year ago
Text
Digital Science announces new AI patent study: IBM leads Google and Microsoft in race to next AI generation
Major players are vying for top position as Generative AI drives technology patent grants up by 16% in the last five years and applications by 31%, according to a new patent study on trends in artificial intelligence by Digital Science company IFI CLAIMS. U.S. commercial giants IBM, Google and Microsoft lead the way as the companies with the most patent applications in Generative AI (GenAI), with…
2 notes · View notes
imlearningdata · 2 years ago
Text
Tumblr media
4 notes · View notes
thedigital-witnesses · 21 days ago
Text
The Stargate Project is giving "IBM and the Holocaust."
"The leaders of OpenAI, Oracle and SoftBank outlined a $500 billion, four-year plan to build out U.S. artificial intelligence infrastructure at the White House on Tuesday. The companies will create a joint venture, dubbed Stargate, which is committing an initial $100 billion to build data centers and other campuses, with more investment to come. SoftBank CEO Masayoshi Son will serve as Stargate's chair; Nvidia, Arm and LinkedIn parent Microsoft are also among the technology partners" (https://www.linkedin.com/news/story/tech-giants-unveil-500b-ai-venture-6530417/)
Hmm... makes me think of:
Tumblr media
"IBM & the Holocaust tells of IBM's strategic alliance with Nazi Germany--beginning in 1933 in the 1st weeks that Hitler came to power & continuing well into WWII. As the 3rd Reich embarked upon its plan of conquest & genocide, IBM & its subsidiaries helped create enabling technologies, step-by-step, from the identification & cataloging programs of the 30s to the selections of the 40s. Only after Jews were identified--a massively complex task Hitler wanted done immediately--could they be targeted for efficient asset confiscation, ghettoization, deportation, enslaved labor & annihilation. It was a cross-tabulation & organizational challenge so monumental, it called for a computer. Of course, in the 30s no computer existed. But IBM's Hollerith punch card technology did exist. Aided by the company's custom-designed & constantly updated Hollerith systems, Hitler was able to automate the persecution of the Jews.
Historians were amazed at the speed & accuracy with which the Nazis were able to identify & locate European Jewry. Until now, the pieces of this puzzle have never been fully assembled. The fact is, IBM technology was used to organize nearly everything in Germany & then Nazi Europe, from the identification of the Jews in censuses, registrations & ancestral tracing programs to the running of railroads & organizing of concentration camp slave labor. IBM & its German subsidiary custom-designed complex solutions, anticipating the Reich's needs. They didn't merely sell the machines & walk away. Instead, IBM leased these machines for high fees & became the sole source of the billions of punch cards needed. 
IBM & the Holocaust details the carefully crafted corporate collusion with the 3rd Reich, as well as the structured deniability of oral agreements, undated letters & the Geneva intermediaries--all undertaken as the newspapers blazed with accounts of persecution & destruction. Just as compelling is the human drama of one of our century's greatest minds, IBM founder Thomas Watson, who cooperated with the Nazis for the sake of profit. Only with IBM's technologic assistance was Hitler able to achieve the staggering numbers of the Holocaust. Edwin Black has now uncovered one of the last great mysteries of Germany's war against the Jews: how Hitler got the names."
Tumblr media
0 notes
x86girl · 18 days ago
Text
boxed software and manuals my beloved. i want this so bad.
Tumblr media
53 notes · View notes
onetechavenue · 1 year ago
Text
IBM Study: Widespread Discontent in Retail Experiences, Consumers Signal Interest in AI-Driven Shopping Amid Economic Strain
Only 9% of respondents say they are satisfied with the in-store shopping experience; only 14% say the same for online shopping. Roughly 80% of consumers surveyed who haven’t used AI for shopping expressed an interest in using the technology for various aspects of their shopping journey. Manila, Philippines — As the retail landscape faces mounting pressure from evolving consumer expectations and…
Tumblr media
0 notes
sbcdh · 2 months ago
Text
The discovery that hypnotic states could be used for market regulatory purposes was nothing short of a revolution for the Federal Reserve. In June 1968, initial experiments with "financial clairvoyance" were conducted.
The original methodology was fairly simple: fully trained subjects would be placed in a sensory deprivation tank and undergo hypnosonic neuro-induction to the point of sub-financial emanation. Subjects would remain attuned for 24 hours, at which point they would be de-emanated, and their experiences recorded via interview.
This methodology proved to be an expensive disaster. Repeated cycles of emanation and de-emanation had a catastrophic effect on mental cohesion. On average, subjects would begin to show signs of neuro-depatterning within the first 50 dives, and would slip into permanent catatonia by 350 dives.
Additionally, recovered documents from the period show that information from a single Plutophant was only accurate to within 2.2i Murdoch deviations, and the interview method introduced a further 5.1i of uncertainty. While experimental attempts to record brain activity directly were underway, the technology was primitive, often harmful to the subject, or necessitated invasive surgical modification. Even then, transchronological brain activity proved uniquely difficult to record.
Then, a breakthrough.
The 1960 nationwide upgrade of the Minuteman nuclear system was underway, which created a surplus of IBM Drum Storage Drives. These drives were largely donated to research institutions under the purview of Project Clover. With some modifications, the cyclical nature of these drum drives proved to be ideal for recording changes in transchronological neuropatterning.
These "Radio-Magnetic Neurological Sensory Arrays" were the predecessor to the modern neuroscope. The first production example, the IBM Y-2, was the size of an entire room, requiring enormous amounts of power and several trained technicians to process the thoughts of a single Plutophant into a human-readable form.
Study is ongoing.
653 notes · View notes
theconstitutionisgayculture · 9 days ago
Note
I think modern society has major issues with mistaking correlation for causation and it's causing a lot of problems.
For example, some years back, there was a study showing that young children from households with a lot of books were scoring higher on literacy tests and doing better academically, whether or not their parents read to them. So everybody decided that books were magical and their mere presence improved kids' ability to learn, and they started a charity to distribute children's books to low-income families.
Now, on a moral level, there is nothing wrong with this. Very few people would argue that it's bad to give books to poor kids. And it probably did some good for some of those kids. But it didn't have the huge dramatic impact that many people were hoping for, because higher literacy rates were not caused by the presence of books. Both of those things were caused by the same third factor.
What kind of person owns a lot of books? What attributes do they value? What traits would they encourage in their children?
It was never about the books. It was always about the parents.
Now for a more disastrous example:
Decades ago, people noticed that college graduates were getting better jobs and earning more money, and they decided that meant everyone should go to college and then everyone would be more successful.
But that's not what happened.
If a particular achievement is seen as optional, then having that achievement says something about you. Back then, a college degree told employers that a prospective hire was someone who went above and beyond, who was willing to work harder to improve their skills and knowledge.
Once college is treated like it's mandatory, a college degree is scarcely more meaningful than a high school diploma.
And the presence of a degree cannot confer upon you the attitude and work ethic that leads to success any more than the presence of books can bestow literacy skills.
Now we have millions of people who took out massive student loans on the promise of success and are left with mountains of debt and mediocre prospects, and we keep shoveling millions more into increasingly corrupted and worthless schools with that same empty promise.
But it was never about the degree. It was always about the kind of person that earned one.
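(Aside: the confounding pattern in both examples above is easy to reproduce in a quick simulation. The sketch below is purely illustrative, with made-up numbers and a hypothetical latent "parental engagement" factor standing in for the hidden third variable; it is not based on the study mentioned in the ask.)

```python
# Illustrative confounder simulation (all numbers invented): a latent
# "parental engagement" factor drives BOTH book ownership and literacy
# scores, so the two correlate strongly even though books have no direct
# causal effect in this toy model.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

engagement = rng.normal(0.0, 1.0, n)                       # unobserved third factor
books = 50 + 30 * engagement + rng.normal(0.0, 10.0, n)    # books in the home
scores = 100 + 15 * engagement + rng.normal(0.0, 10.0, n)  # literacy score (no books term)

r = np.corrcoef(books, scores)[0, 1]
print(f"correlation between books and scores: {r:.2f}")    # ~0.79, despite zero causal effect
```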
So, my dad was working for IBM back when corporations started listing college degrees as a requirement for employment. He was a data entry guy for the old style punch card computers, which means when someone wanted to ask the computer something they came to him, he set up the punch card, fed it into the computer, and read out the answer. When all these college graduates started getting hired, his job changed. Now it was his responsibility to train them how to do his job. But, you sensibly ask, didn't they have college degrees? Didn't they learn all this in college? And the answer is yes, they did have college degrees. They all had MBAs, which taught them nothing about how to work computers. IBM just listed "MBA" as a requirement for every non-secretarial/custodial job because they thought having a large number of college graduates on staff sounded good. So these kids spent four years in college only to come out and get a low-paying data entry job instead of the middle manager job they were expecting, and once they got that job they needed my dad to give them the on-the-job training they could have gotten four years earlier, with no money spent on college, if the job listing hadn't required an MBA. In the stories my dad told me, most of these people quit after a year because they were told in college that this degree would get them a better job, and they didn't want to be lowly data entry people.
And nothing's really changed. Jobs that can easily be taught via on-the-job training or an apprenticeship model require college degrees. Colleges and guidance counselors lie about what kind of job a graduate can expect. And now you have overeducated people loading up the Keurig machine at Starbucks to pay off their student debt because there are too many college graduates all going after the same jobs and not enough of those jobs to go around. Mandatory college has always been a scam. It's an artificial requirement that only exists because businesses think it looks good to hire people who have a piece of paper they can hang on the wall. The fact is, only very specialized jobs where on-the-job training wouldn't work need a college graduate. But there are billions of dollars at stake in the college racket, so on it goes.
108 notes · View notes
scotianostra · 1 year ago
Text
Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media
On February 16th 1954 the writer Iain Banks was born in Dunfermline, Fife
Banks was the son of a professional ice skater and an Admiralty officer. He spent his early years in North Queensferry and later moved to Gourock because of his father's work. He received his early education at Gourock and Greenock High Schools, and at the young age of eleven he decided to pursue a career in writing. He penned his first novel, titled The Hungarian Lift-Jet, in his adolescence. He then enrolled at the University of Stirling, where he studied English, philosophy and psychology. During his freshman year, he wrote his second novel, TTR.
Subsequent to attaining his bachelor's degree, Banks worked a succession of jobs that allowed him some free time to write. The assortment of employments supported him financially throughout his twenties. He even managed to travel through Europe, North America and Scandinavia, during which time he was employed as an analyzer for IBM, as a technician, and as a costing clerk in a London law firm. At the age of thirty he finally had his big break when he published his debut novel, The Wasp Factory, in 1984, after which he embraced full-time writing. It is considered to be one of the most inspiring teenage novels. The instant success of the book restored his confidence as a writer, and that's when he took up science fiction writing.
In 1987, he published his first sci-fi novel, Consider Phlebas, a space opera. The title is inspired by a line in T.S. Eliot's classic poem The Waste Land. The novel is set in a fictional interstellar anarchist-socialist utopian society named the Culture. The focus of the book is the ongoing war between the Culture and the Idiran Empire, which the author explores through smaller, microcosmic conflicts. The protagonist, Bora Horza Gobuchul, unlike other stereotypical heroes, is portrayed as a morally ambiguous individual, which appeals to readers. Additionally, the grand scenery and the use of a variety of literary devices contribute to the book's extremely positive reception. Its sequel, The Player of Games, came out the very next year, which paved the way for seven further volumes in the Culture series.
Besides the Culture series, Banks wrote several stand-alone novels. Some of them were adapted for television, radio and theatre: BBC television adapted his novel The Crow Road (1992), and BBC Radio 4 broadcast Espedair Street. The literary influences on his works include Isaac Asimov, Dan Simmons, Arthur C. Clarke, and M. John Harrison. He was featured in a television documentary, The Strange Worlds of Iain Banks South Bank Show, which discussed his literary writings. In 2003, he published a non-fiction book, Raw Spirit, which is a travelogue of Scotland. Banks's last novel, titled The Quarry, appeared posthumously. He also penned a collection of poetry that he could not publish in his lifetime; it was released in 2015. He received a multitude of honours for his contribution to literature, with awards and nominations including the British Science Fiction Association Award, the Arthur C. Clarke Award, the Locus Poll Award, the Prometheus Award and the Hugo Award.
Iain Banks was diagnosed with terminal cancer of the gallbladder and died at the age of 59 in the summer of 2013.
322 notes · View notes
mostlysignssomeportents · 7 months ago
Text
Richard R John’s “Network Nation”
Tumblr media
THIS SATURDAY (July 20), I'm appearing in CHICAGO at Exile in Bookville.
Tumblr media
The telegraph and the telephone have a special place in the history and future of competition and Big Tech. After all, they were the original tech monopolists. Every discussion of tech and monopoly takes place in their shadow.
Back in 2010, Tim Wu published The Master Switch, his bestselling, wildly influential history of "The Bell System" and the struggle to de-monopolize America from its first telecoms barons:
https://memex.craphound.com/2010/11/01/the-master-switch-tim-net-neutrality-wu-explains-whats-at-stake-in-the-battle-for-net-freedom/
Wu is a brilliant writer and theoretician. Best known for coining the term "Net Neutrality," Wu went on to serve in both the Obama and Biden administrations as a tech trustbuster. He accomplished much in those years. Most notably, Wu wrote the 2021 executive order on competition, laying out a 72-point program for using existing powers vested in the administrative agencies to break up corporate power and get the monopolist's boot off Americans' necks:
https://www.eff.org/de/deeplinks/2021/08/party-its-1979-og-antitrust-back-baby
The Competition EO is basically a checklist, and Biden's agency heads have been racing down it, ticking off box after box on or ahead of schedule, making meaningful technical changes in how companies are allowed to operate, each one designed to make material improvements to the lives of Americans.
A decade and a half after its initial publication, Wu's Master Switch is still considered a canonical account of how the phone monopoly was built – and dismantled.
But somewhat lost in the shadow of The Master Switch is another book, written by the accomplished telecoms historian Richard R John: "Network Nation: Inventing American Telecommunications," published a year after The Master Switch:
https://www.hup.harvard.edu/books/9780674088139
Network Nation flew under my radar until earlier this year, when I found myself speaking at an antitrust conference where both John and Wu were also on the bill:
https://www.youtube.com/watch?v=2VNivXjrU3A
During John's panel – "Case Studies: AT&T & IBM" – he took a good-natured dig at Wu's book, claiming that Wu, not being an historian, had been taken in by AT&T's own self-serving lies about its history. Wu – also on the panel – didn't dispute it, either. That was enough to pique my interest. I ordered a copy of Network Nation and put it in my suitcase for my vacation earlier this month.
Network Nation is an extremely important, brilliantly researched, deep history of America's love/hate affair with not just the telephone, but also the telegraph. It is unmistakably a history book, one that aims at a definitive takedown of various neat stories about the history of American telecommunications. As Wu writes in his New Republic review of John's book:
Generally he describes the failure of competition not so much as a failure of a theory, but rather as the more concrete failure of the men running the competitors, many of whom turned out to be incompetent or unlucky. His story is more like a blow-by-blow account of why Germany lost World War II than a grand theory of why democracy is better than fascism.
https://newrepublic.com/article/88640/review-network-nation-richard-john-tim-wu
In other words, John thinks that the monopolies that emerged in the telegraph and then the telephone weren't down to grand forces that made them inevitable, but rather, to the errors made by regulators and the successful gambits of the telecoms barons. At many junctures, things could have gone another way.
So this is a very complicated story, one that uses a series of contrasts to make the point that history is contingent and owes much to a mix of random chance and the actions of flawed human beings, and not merely great economic or historical laws. For example, John contrasts the telegraph with the telephone, posing them against one another as a kind of natural experiment in different business strategies and regulatory responses.
The telegraph's early promoters, including Samuel Morse (as in "Morse code"), believed that the natural way to roll out the telegraph was to sell the patents to the federal government and have an agency like the post office operate it. There was a widespread view of the post office as a paragon of excellent technical management and a necessity for knitting together the large American nation. Moreover, everyone could see that when the post office partnered with private sector tech companies (like the railroads that became essential to the postal system), the private sector inevitably figured out how to gouge the American public, leading regulators to ever-more extreme measures to rein in the ripoffs.
The telegraph skated close to federalization on several occasions, but kept getting snatched back from the brink, ending up instead as a privately operated system that primarily served deep-pocketed business customers. This meant that telegraph companies were forever jostling to get the right to string wires along railroad tracks and public roads, creating a "political economy" that tried to balance out highway regulators and rail barons (or play them off against each other).
But the leaders of the telegraph companies were largely uninterested in "popularizing" the telegraph – that is, figuring out how ordinary people could use telegraphs in place of the hand-written letters that were the dominant form of long-distance communications at the time. By turning their backs on "popularization," telegraph companies largely freed themselves from municipal oversight, because they didn't need to get permission to string wires into every home in every major city.
When the telephone emerged, its inventors and investors initially conceived of it as a tool for business as well. But while the telegraph had ushered in a boom in instantaneous, long-distance communications (for example, by joining ports and distant cities where financiers bought and sold the ports' cargo), the telephone proved far more popular as a way of linking businesses within city limits. Brokers and financiers and businesses that were only a few blocks from one another found the telephone to be vastly superior to the system of dispatching young boys to race around urban downtowns with slips bearing messages.
So from the start, the phone was much more bound up in city politics, and that only deepened with popularization, as phones worked their ways into the homes of affluent families and local merchants like druggists, who offered free phone calls to customers as a way of bringing trade through the door. That created a great number of local phone carriers, who had to fend off Bell's federally enforced patents and aldermen and city councilors who solicited bribes and favors.
To make things even more complex, municipal phone companies had to fight with other sectors that wanted to fill the skies over urban streets with their own wires: streetcar lines and electrical lines. The unregulated, breakneck race to install overhead wires led to an epidemic of electrocutions and fires, and also degraded service, with rival wires interfering with phone calls.
City politicians eventually demanded that lines be buried, creating another source of woe for telephone operators, who had to contend with private or quasi-private operators who acquired a monopoly over the "subways" – tunnels where all these wires eventually ended up.
The telegraph system and the telephone system were very different, but both tended to monopoly, often from opposite directions. Regulations that created some competition in telegraphs extinguished competition when applied to telephones. For example, Canada federalized the regulation of telephones, with the perverse effect that everyday telephone users in cities like Toronto had much less chance of influencing telephone service than Chicagoans, whose phone carrier had to keep local politicians happy.
Nominally, the Canadian Members of Parliament who oversaw Toronto's phone network were big leaguers who understood prudent regulation and were insulated from the daily corruption of municipal politics. And Chicago's aldermen were pretty goddamned corrupt. But Bell starved Toronto of phone network upgrades for years, while Chicago's gladhanding political bosses forced Chicago's phone company to build and build, until Chicago had more phone lines than all of France. Canadian MPs might have been more remote from rough-and-tumble politics, but that made them much less responsive to a random Torontonian's bitter complaint about their inability to get a phone installed.
As the Toronto/Chicago story illustrates, the fact that there were so many different approaches to phone service tried in the US and Canada gives John more opportunities to contrast different business-strategies and regulations. Again, we see how there was never one rule that governments could have used if they wanted to ensure that telecoms were well-run, widely accessible, and reasonably priced. Instead, it was always "horses for courses" – different rules to counter different circumstances and gambits from telecoms operators.
As John traces through the decades during which the telegraph and telephone were established in America, he draws heavily on primary sources to trace the ebb and flow of public and elite sentiment towards public ownership, regulation, and trustbusting. In John's hands, we see some of the most spectacular failures as not just a mismatch of regulatory strategy to corporate gambit, but a mismatch of political will and corporate gambit. If a company's power would be best reined in by public ownership, but the political vogue is for regulation, then lawmakers end up trying to make rules for a company they should simply be buying and handing over to the post office.
This makes John's history into a history of the Gilded Age and its trustbusters. Notorious vulture capitalists like Jay Gould shocked the American conscience by declaring that businesses had no allegiance to the public good, and were put on this Earth to make as much money as possible no matter what the consequences. Gould repeatedly "raided" Western Union, acquiring shares and forcing the company to buy him out at a premium to end his harassment of the board and the company's managers.
By the time the feds were ready to buy out Western Union, Gould was a massive shareholder, meaning that any buyout of the telegraph would make Gould infinitely wealthier, at public expense, in a move that would have been electoral poison for the lawmakers who presided over it. In this highly contingent way, Western Union lived on as a private company.
Americans, including prominent businesspeople who would be considered "conservatives" by today's standards, were deeply divided on the question of monopoly. The big, successful networks of national telegraph lines and urban telephone lines were marvels, and it was easy to see how they benefited from coordinated management. Monopolists and their apologists weaponized this public excitement about telecoms to defend their monopolies, insisting that their achievement owed its existence to the absence of "wasteful competition."
The economics of monopoly were still nascent. Ideas like "network effects" (where the value of a service increases as it adds users) were still controversial, and the bottlenecks posed by telephone switching and human operators meant that the cost of adding new subscribers sometimes went up as the networks grew, in a weird diseconomy of scale.
Patent rights were controversial, especially patents related to natural phenomena like magnetism and electricity, which were viewed as "natural forces" and not "inventions." Business leaders and rabble-rousers alike decried patents as a federal grant of privilege, leading to monopoly and its ills.
Telecoms monopolists – telephone and telegraph alike – had different ways to address this sentiment at different times (for example, the Bell System's much-vaunted commitment to "universal service" was part of a campaign to normalize the idea of federally protected, privately owned monopolies).
Most striking about this book were the parallels to contemporary fights over Big Tech trustbusting, in our new Gilded Age. Many of the apologies offered for Western Union or AT&T's monopoly could have been uttered by the Renfields who carry water for Facebook, Apple and Google. John's book is a powerful and engrossing reminder that variations on these fights have occurred in the not-so-distant past, and that there's much we can learn from them.
Wu isn't wrong to say that John is engaging with a lot of minutiae, and that this makes Network Nation a far less breezy read than The Master Switch. I get the impression that John is writing first for other historians, and for writers of popular history like Wu, in a bid to create the definitive record of all the complexity that is elided when we create tidy narratives of telecoms monopolies, and tech monopolies in general. Bringing Network Nation on my vacation as a beach-read wasn't the best choice – it demands a lot of serious attention. But it amply rewards that attention, too, and makes an indelible mark on the reader.
Tumblr media
Support me this summer on the Clarion Write-A-Thon and help raise money for the Clarion Science Fiction and Fantasy Writers' Workshop!
Tumblr media
If you'd like an essay-formatted version of this post to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
https://pluralistic.net/2024/07/18/the-bell-system/#were-the-phone-company-we-dont-have-to-care
64 notes · View notes
retrocgads · 1 year ago
Text
Tumblr media
USA 1997
95 notes · View notes
the-most-humble-blog · 27 days ago
Text
The Great College Lie: Why Degrees Don’t Mean Success
Tumblr media
For years, we’ve been fed the same script: Go to college, get a degree, and the world will roll out a red carpet to your success. Sounds simple, right? Except, for many, that “red carpet” feels more like a never-ending hamster wheel of debt, underemployment, and job applications that go straight into the void. So, what happened? Did college lie to us, or did we buy into a dream that was never designed to include everyone?
Let’s dissect The Great College Lie—why the degree doesn’t guarantee success, and what you can do to thrive despite the system.
1. The Promise vs. Reality
The Promise:
College is marketed as the “great equalizer.” They told us education would unlock the American Dream: a steady career, financial security, and a house with a white picket fence. And sure, for some, it worked. But for many others, here’s the reality:
The Reality:
Student Loan Debt: The average college graduate in the U.S. owes $37,000+ in student loans, which can take decades to pay off (see the rough payoff sketch after this list).
Underemployment: Over 40% of college graduates work jobs that don’t require a degree (hello, barista jobs with a philosophy major).
No Guarantees: That diploma doesn’t protect you from layoffs, market crashes, or a rapidly evolving job market that now demands experience over credentials.
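A rough way to see why "decades" is plausible: the hedged sketch below applies the standard loan-amortization formula to hypothetical numbers. The 6% interest rate and $250 monthly payment are assumptions for illustration, not figures from this post.

```python
# Rough payoff sketch for the student-loan bullet above.
# Assumed values (not from the post): $37,000 balance, 6% annual interest,
# $250/month payment, interest compounding monthly.
import math

balance = 37_000.0
annual_rate = 0.06
payment = 250.0

r = annual_rate / 12                                   # monthly interest rate
# Months to pay off a fixed-payment loan: n = -ln(1 - B*r/P) / ln(1 + r)
months = -math.log(1 - balance * r / payment) / math.log(1 + r)
print(f"~{months / 12:.1f} years to pay off")          # about 22.5 years under these assumptions
```

A larger payment shortens this dramatically: roughly $411/month clears the same assumed balance in ten years, which is the standard federal repayment term.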
2. Why Degrees Don’t Equal Success
1. It’s About Who You Know, Not What You Know
Networking often outranks education. Studies have shown that up to 70% of jobs are never even posted publicly—they’re filled through connections. Translation? You can have a degree from Harvard, but Chad with zero qualifications might get the job because his dad plays golf with the CEO.
2. Degrees are Losing Their Edge
A bachelor’s degree used to set you apart. Now? It’s almost like having a high school diploma. Everyone has one, which means the competition is fiercer, and employers are raising their standards to include master’s degrees and certifications.
3. The Skills Gap is Real
A piece of paper doesn’t always mean you have the skills employers need. A 2021 survey revealed that 46% of employers feel recent grads aren’t prepared for the workforce. Critical thinking, problem-solving, and real-world experience often trump textbook knowledge.
3. The Student Loan Scam
Let’s call it what it is: a scam. The system was designed to profit off your dreams. Here’s how it works:
Colleges Overpromise: They lure students with flashy marketing, luxurious dorms, and vague promises of a “bright future.”
Loans Trap You: The government and private lenders make it easy to borrow, but repayment terms keep you financially enslaved for decades.
Inflated Costs: College tuition has skyrocketed over 1200% since 1980, far outpacing wage growth. So, you’re borrowing more but earning less.
4. “But College is Still Worth It, Right?”
It depends. For some fields—like medicine, law, and engineering—a degree is non-negotiable. But for many careers, it’s becoming clear that skills and experience matter more than credentials.
Here’s the Shift:
Trade Schools and Certifications: Electricians, plumbers, and tech professionals often earn just as much (or more) than degree holders—with a fraction of the debt.
Freelance and Entrepreneurial Skills: The internet has opened doors to self-taught careers in writing, design, coding, and more.
On-the-Job Learning: Companies like Google, Tesla, and IBM no longer require degrees for many positions—they value skills instead.
5. So, What Should You Do Instead?
1. Learn Marketable Skills
Platforms like Coursera, Udemy, and Khan Academy offer affordable (sometimes free) courses on coding, graphic design, marketing, and more.
The ROI on these courses often far exceeds a traditional degree.
2. Network, Network, Network
Attend local events, join LinkedIn groups, and connect with mentors in your field.
Remember: Jobs often go to those with connections—not just qualifications.
3. Embrace Lifelong Learning
The job market evolves constantly. Staying ahead means continually updating your skills, whether through certifications or self-study.
4. Question the Narrative
Don’t blindly follow the “go to college” script. Ask yourself: What do I want to do, and is college the best path to get there?
6. The Humble Truth About Success
Here’s the real kicker: Success isn’t tied to a degree—it’s tied to your grit, adaptability, and willingness to hustle smart.
Degrees can help, but they aren’t the golden ticket we were promised.
Building real-world skills, learning to market yourself, and forming relationships will often get you farther than any diploma can.
Tumblr media
What They Don’t Want You to Know
The Great College Lie isn’t just about the myth of guaranteed success—it’s about the systems that profit from your hopes and dreams. College can be a valuable tool, but it’s not the only path to success.
The sooner we stop glorifying degrees and start valuing skills, effort, and innovation, the better off we’ll all be. In the meantime, let’s admit one thing: We were all sold a dream. But it’s not too late to wake up and rewrite the story.
28 notes · View notes
duskofastraeus · 9 months ago
Text
Tumblr media Tumblr media
Friday, 31st of May.
Had to pull a sort of all-nighter the day before to finish some French units before the deadline.
Had a private French class in order to analyse my written compositions with my professor
Finished the first course of the IBM Data Science certificate
Started planning out some essays and writing I need to submit during the summer
Notes of the day:
- I've been feeling quite fatigued recently; though it is not the first time I am preparing for a language certificate, I still feel a bit nervous and must discipline myself to deal with these emotions rationally.
- A part of me is quite envious of seeing my colleagues enjoying their vacations and time off university while I have to deal with additional examinations/studies and an internship but I should recognise that this surplus work will pay off in the future and I shouldn’t discourage myself.
- I wished I had more time to do readings for the next semester, but for now 2-3 hours of my day will have to suffice. I am a bit anxious for the opening of the application process for next summer’s internships. I need to acquire a research internship in my field of choice and I am not sure if I’ll obtain one in my own university because of the competition between 4 different stages of study going for the same internships…
- Academia aside, I’ve been spending the majority of my days either in libraries or alone in coffee shops doing some work. It is not the first time I spend a summer by myself and I think I’ve learnt to enjoy my own company harmoniously.
56 notes · View notes
icebear4president · 3 months ago
Text
Since men are so ready to take away women’s right to vote and say we’re sooo uneducated and need to know our places, please, have these inventions and scientific discoveries that were credited to men instead 🥰
Hedy Lamarr: Wireless communication. Hollywood actor Hedy Lamarr should actually be the person credited with the invention of wireless communication. During the Second World War, Hedy worked closely with George Antheil to develop the idea of "frequency hopping," which would have prevented enemy jamming of the radio signals guiding Allied torpedoes. Unfortunately, the U.S. Navy ignored her patent—and later used her findings to develop new technologies. Years later, her patent was re-discovered by a researcher, which led to Lamarr receiving the Electronic Frontier Foundation Award shortly before her death in 2000.
Alice Ball: Cure for leprosy. Alice Ball was a young chemist at Kalihi Hospital in Hawaii who focused on Hansen's disease, a.k.a. leprosy. Her research sought to find a cure for the disease by figuring out how to inject chaulmoogra oil directly into the bloodstream. Topical treatments worked, but had side effects patients weren't interested in. Sadly, Ball became sick and returned home, where she died in 1916. Arthur Dean took over her study, and Ball became a memory—until a later medical paper credited the technique to her as the "Ball Method." Her method was used for over two decades all over the world to treat the disease.
Elizabeth Magie Phillips: Monopoly. The invention of everyone's favorite board game has been credited to Charles Darrow, who sold it to Parker Brothers in 1935. But it was Elizabeth Magie Phillips who came up with the original inspiration, The Landlord's Game, in 1903. Ironically, she designed the game to protest against monopolists like Andrew Carnegie and John D. Rockefeller.
Marion Donovan: Disposable diapers. In the '40s, new mothers had very few options for diapers. There was cloth...and that was pretty much it. The daughter of an inventor, Donovan earned her first patent for a diaper cover. She later added buttons, eliminating the need for safety pins. Her original version was made from a shower curtain, and her final one from nylon parachute cloth. This new design helped keep children and clothes cleaner and drier, not to mention helping with rashes. But, of course, diaper companies at first ignored her patent.
Vera Rubin: Dark matter. Rubin is the astrophysicist whose work confirmed the existence of dark matter in the universe. She worked with astronomer Kent Ford in the '60s and '70s, when they discovered that stars at the outer edges of galaxies move in ways that can only be explained by unseen mass. She has been dubbed a "national treasure" but never received a Nobel Prize.
Margaret Knight: Square-bottomed paper bags. In 1868, Knight invented a machine that folded and formed flat, square-bottomed brown paper bags. She built a wooden model of the device, but couldn't apply for a patent until she made an iron model. While the model was being developed in the shop, a man named Charles Annan stole the idea and patented it. Though he received credit for it, Knight filed a lawsuit and finally won the rights to it in 1871.
Dr. Grace Hopper: Computer Programming Language. Hopper was one of the first programmers of the Harvard Mark I, the IBM-built computer often used for World War II efforts, and she went on to create the first compiler for a computer programming language. Though history often credits John von Neumann with initiating the computer's first program, Hopper is the one who wrote the code to program it. One of the programming languages she pioneered, COBOL, is widely used today.
Ada Harris: Hair straightener. Marcel Grateau is often credited for the invention of the hair straightener, but it was Harris who first claimed the patent for it in 1893. (Grateau made his claim to fame with the curling iron around 1852, and we certainly know there's a difference.)
Esther Lederberg: Microbial Genetics. Lederberg played a large part in determining how genes are regulated, along with the process of making RNA from DNA. She often collaborated with her husband Joshua Lederberg on their work on microbial genetics, but it was Esther who discovered lambda phage—a virus that infects E. coli bacteria. Despite their collaboration, her husband claimed the 1958 Nobel Prize in Physiology or Medicine for discoveries on how bacteria mate.
Jocelyn Bell Burnell: Pulsars. Jocelyn Bell Burnell discovered irregular radio pulses while working as a research assistant at Cambridge. After showing the discovery of the pulses to her advisor, the team worked together to uncover what they truly were: neutron stars, AKA pulsars. Burnell received zero credit for her discovery—instead, her advisor Antony Hewish and Martin Ryle received the Nobel Prize for Physics in 1974.
Chien-Shiung Wu: Nuclear Physics. Often compared to Marie Curie, Chien-Shiung Wu worked on the Manhattan Project, where she helped develop the process for separating uranium isotopes. In 1956, she conducted the Wu experiment, which tested parity conservation in weak interactions. After it yielded surprising results, Tsung-Dao Lee and Chen-Ning Yang, the physicists who had proposed the theory her experiment confirmed, received credit for her work, winning the Nobel Prize in 1957.
Ada Lovelace: Computer algorithm. In the mid-1800s, Ada Lovelace wrote the instructions for the first computer program. But mathematician and inventor Charles Babbage is often credited with the work because he invented the actual engine.
Rosalind Franklin: DNA Double Helix. Franklin's X-ray photographs of DNA revealed the molecule's true structure as a double helix. James Watson and Francis Crick used her images and data, without her knowledge, to build their model of DNA, and they, along with Maurice Wilkins, received the Nobel Prize for the discovery after Franklin's death.
The ENIAC Programmers: First electronic computer. The ENIAC (Electronic Numerical Integrator and Computer) was the first general-purpose electronic computer. In 1946, six women programmed this electronic computer as part of a secret World War II project. Inventor John Mauchly is often the only one who gets credit for its creation, but the six programmers are the ones who made the machine actually work.
Lise Meitner: Nuclear Fission. Meitner discovered the true power of uranium, explaining that atomic nuclei split during some reactions. The discovery was credited to her lab partner Otto Hahn, who won the Nobel Prize in Chemistry in 1944.
Katherine Johnson: Moon landing. She calculated the exact path for the Freedom 7 spacecraft to successfully enter space for the first time in 1961, and later for the Apollo 11 mission to land on the moon in 1969. She often went unrecognized by her male colleagues and faced racial discrimination.
Mary Anderson: Windshield wipers. Anderson first came up with the idea of windshield wipers while riding in a streetcar in the snow. She tried selling her device to companies after receiving the patent for it in 1903, but all of them rejected her invention. It wasn't until the '50s and '60s, when faster automobiles were common, that companies took to the idea. By then, Anderson's patent had expired, and later, inventor Robert Kearns was credited with the idea.
Nettie Stevens: Sex chromosomes. Stevens discovered the connection between chromosomes and sex determination. Despite Stevens' breakthrough, her colleague and mentor E.B. Wilson published his papers before her and is often noted for the discovery.
Caresse Crosby: The modern bra. Caresse Crosby, who developed the modern bra. She was the first to acquire the patent for the modern bra, AKA a "Backless Brassiere," yet is often left in the shadows because she sold her patent to the Warner Brothers Corset Company.
12 notes · View notes
origami10 · 7 months ago
Text
AjinWeek24 fic (title TBD)
I was feeling bloodthirsty brainstorming for the week so it's all on the body theme, inspired by the worst things I could come up with.
Author’s note: I’m very sorry to anyone out there who actually knows anatomy reading this. You’ll probably be going “it doesn’t work like that!!” the whole time. I told myself I would study up on it but I haven't, and I wanted to just get this fic out there sometime before 2047. Also I’m very sorry to anyone reading this in general. This is verrrry much hurt/no comfort so if you’re not feeling it, hit that back button and try again later, or just leave, I don’t mind! Dead doves and all that, or dead… main characters. (Please picture origami10 holding a knife and grinning like Satou) My standard for what to write about this time was, if it makes me recoil in horror from having thought it up, that’s what I should write. You’ve been warned.
Day 1: Head
Apparently, the only thing Samuel T. Owen found more exciting than crashing planes into populated areas was torturing Kei personally.
And this time, he wasn’t playing games.
It was down in the sewer where Satou caught up to him. A tremor in the shadows materialized into the flat-topped head of Satou's IBM, followed by Satou's narrowed eyes glinting out of the darkness behind. No sooner had Kei thought "Ah, this is ba-" than the hand of the creature was around his throat, jetting him back up against the concrete wall. The impact to his head knocked stars into his vision, and in the brief span he was unconscious, the nearly invisible yet undeniably solid hands of the IBM had moved to pin his biceps back against the cold, rough stone.
“I always make good on my threats, you know.” Satou’s voice was sing-song, but not a single gleam of mirth showed in his eyes. “But, I don’t mind taking my time getting there.”
The IBM’s hands pressed over Kei’s arms like a vice. It faced him head on, and as it leaned its head closer and closer, to Kei’s horror, its flat top lifted to reveal a gaping snake mouth. The yawning maw drew closer and closer, filling his vision like a nightmare.
For something so stiff on the outside, the inside of its mouth as it closed over Kei's head was as soft and pliable as any human flesh. That flat head slid down flush against the wall, forcing its way over the back of Kei's head even as its lower jaw swallowed up his face. A more sober Kei might have been intrigued to note that the IBM wasn't inhibited by the same bone structure that would limit a snake's bite, but right now a full panic was overriding his usual rationality. Even if he had the reserves to produce his own IBM right now, Satou now knew the trick of attacking its head, and in these close quarters it'd be almost as useless as he was against Satou.
He was completely at Satou’s mercy. The jaws of the IBM closed over his neck.
15 notes · View notes