#Mainframe Applications
Explore tagged Tumblr posts
shiprasharma2927 · 2 years ago
Text
Explore modern mainframe migration strategies for a future-ready IT landscape. Embrace agility, cost-efficiency, and innovation.
0 notes
vrnexgen · 5 days ago
Text
Accelerate Digital Transformation with Application Modernization Services – VRNexGen
0 notes
maintectechnologies · 2 months ago
Text
Minimizing Risk in Mainframe Application Upgrades and Migrations
In today’s enterprise landscape, mainframe application upgrades and migrations are not merely technical exercises; they are strategic imperatives. As organizations strive to modernize legacy infrastructure and align with digital transformation initiatives, the ability to manage risk throughout this transition becomes paramount.
Understanding the Risk Landscape
Mainframe environments are often characterized by deep-rooted complexity. They house decades of business logic written in legacy languages like COBOL and interfacing with numerous mission-critical systems. Any upgrade or migration introduces inherent risk.
These risks typically fall into three primary categories:
Operational Risks: System downtime, degraded performance, and service unavailability.
Technical Risks: Integration failures, data corruption, and regression errors.
Compliance Risks: Gaps in audit trails, data sovereignty concerns, and failure to meet regulatory mandates.
A clear understanding of these risk vectors is essential to crafting a mitigation strategy that safeguards operational continuity and business outcomes.
Conducting a Holistic Assessment Before Initiating Change
Comprehensive assessment forms the foundation of a successful modernization initiative. Organizations must undertake detailed application and infrastructure inventories, documenting interdependencies, data flows, and integration points across the enterprise ecosystem.
This assessment must extend beyond the technical realm to include a business impact analysis. Key questions must be addressed: What are the acceptable thresholds for downtime? Which functions are business-critical? How does risk tolerance vary across departments?
Establishing these parameters ensures that the migration roadmap is aligned with business objectives and resilience expectations.
Strategizing for a Seamless Transition
An effective modernization strategy is phased and deliberate, not abrupt. Each stage of the upgrade or migration must be designed to mitigate risk and ensure continuity.
Selecting the appropriate modernization path is critical:
Rehosting: Shifts applications to new platforms with minimal change; quicker, but it often preserves legacy inefficiencies.
Refactoring: Involves re-architecting code for scalability and performance; more complex, but future-ready.
Replatforming: Balances modernization with risk by adapting applications for new environments without major code changes.
Each approach must be evaluated in the context of long-term scalability, performance goals, and budgetary considerations.
Fortifying Change Management Processes
Robust change management practices are central to minimizing risk. Automation plays a crucial role in enabling thorough testing, validation, and environment simulation prior to deployment.
Regression testing, performance benchmarking, and sandbox testing environments ensure that issues are identified early. Rollback protocols must be established to enable swift recovery in the event of an unforeseen failure.
Version control, configuration management, and detailed documentation are essential enablers of traceability, repeatability, and compliance throughout the transition lifecycle.
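To make the automation point above concrete, here is a minimal, purely illustrative sketch of a pre-deployment gate with a rollback path. The script names, the sandbox flag, and the response-time threshold are placeholders invented for the example, not part of any specific toolchain described in this post.

```python
# Illustrative only: a minimal pre-deployment gate with rollback.
# The shell scripts and threshold below are hypothetical placeholders.
import subprocess
import sys

MAX_AVG_RESPONSE_MS = 250  # hypothetical SLA threshold from the business impact analysis


def run_step(name: str, command: list) -> bool:
    """Run one validation step against the sandbox environment and report pass/fail."""
    result = subprocess.run(command, capture_output=True, text=True)
    passed = result.returncode == 0
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
    return passed


def main() -> None:
    checks = [
        ("regression suite", ["./run_regression_suite.sh", "--env", "sandbox"]),
        ("performance benchmark", ["./benchmark_transactions.sh",
                                   "--max-avg-ms", str(MAX_AVG_RESPONSE_MS)]),
    ]
    if all(run_step(name, cmd) for name, cmd in checks):
        print("All checks passed; promoting change.")
        return
    # Rollback protocol: restore the previously tagged known-good configuration.
    print("A check failed; invoking rollback.")
    subprocess.run(["./rollback.sh", "--to", "last-known-good"], check=False)
    sys.exit(1)


if __name__ == "__main__":
    main()
```

The point is the shape of the gate, not the tooling: every change passes regression and benchmark checks in a sandbox before promotion, and a failure triggers a pre-agreed rollback rather than ad hoc recovery.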
Ensuring Cross-Functional Collaboration
Mainframe transformations impact a broad spectrum of stakeholders, including IT, compliance, business units, and third-party vendors. Risk increases significantly when these groups operate in silos.
Successful modernization efforts prioritize integrated governance. Cross-functional alignment ensures that change initiatives are not only technically sound but also operationally feasible and regulatory-compliant.
Structured communication channels, stakeholder alignment workshops, and continuous feedback mechanisms are vital to sustaining collaboration and accountability at every stage of the process.
Post-Migration Governance and Optimization
The completion of a migration or upgrade does not signify the end of the risk management journey; it marks the beginning of a new phase: stabilization and optimization.
Continuous monitoring is essential to ensure performance benchmarks are met and anomalies are promptly addressed. Key performance indicators (KPIs) should be regularly reviewed, and service level agreements (SLAs) revalidated.
Additionally, this is the opportune moment to identify and remediate residual technical debt. Optimization initiatives, ranging from workload tuning to resource reallocation, should be embedded into a continuous improvement framework.
Regulatory compliance should be re-validated to ensure that audit trails, access controls, and data protection measures are aligned with evolving standards.
Conclusion
Mainframe application upgrades and migrations, when executed without strategic rigor, pose significant risks to operational stability and regulatory compliance. However, with a structured approach that prioritizes comprehensive assessment, phased execution, cross-functional collaboration, and ongoing optimization, organizations can modernize with confidence.
By embedding risk mitigation into every phase of the transformation lifecycle, and by leveraging the precision of mainframe application management services where appropriate, enterprises can unlock the full value of modernization while preserving the reliability, security, and performance that define mainframe computing.
0 notes
simonh · 2 months ago
Video
Converting to IBM System/360, 1964 by Colorcubic™ Via Flickr: colorcubic.com/2010/06/04/ibm-system360/
0 notes
avendata85 · 5 months ago
Text
Avendata offers comprehensive legacy Mainframe Systems support, helping businesses seamlessly manage and modernize their legacy infrastructure. With our specialized archiving and migration services, we ensure secure data storage and smooth transitions to newer platforms. Our IT Application Decommissioning services allow you to retire outdated applications, reducing complexity and ensuring compliance, all while maintaining data integrity and minimizing operational disruptions.
0 notes
enterprisemobility · 2 years ago
Text
Digital Transformation Services
Your Switch to a Digital Future – Digital Transformation Consulting Services
Being a leading name amongst Digital Transformation companies and service providers, Enterprise Mobility has been handholding enterprises on their Digital Transformation journeys for two decades now.
1 note · View note
govindhtech · 2 years ago
Text
The Benefits of Mainframe Application Modernization
The rapid development of innovative technologies, in conjunction with ever-increasing expectations from consumers and continuing disruptive market dynamics, is compelling businesses to place a greater emphasis than ever before on digital transformation. In a recent survey conducted by the IBM Institute for Business Value in cooperation with Oxford Economics, 67% of executive respondents stated that their organizations need to transform quickly in order to keep up with the competition, while 57% reported that current market disruptions are placing unprecedented pressure on their IT.
Because digital transformation puts enormous demands on current applications and data, an enterprise’s heterogeneous technological environment, which may include cloud and mainframe computing, has to be modernized and integrated. It should come as no surprise that chief executive officers have listed the modernization of their companies’ technologies as one of their highest priorities. CEOs are looking to reinvent their goods, services, and operations in order to increase their organizations’ efficiencies, agility, and speed to market.
In order to run and create services in a consistent manner throughout their hybrid cloud environments, businesses want platforms that are flexible, secure, open, and tailored to their specific needs. Since mission-critical applications continue to take advantage of the capabilities offered by mainframes, the mainframe will remain an important component in this process. A hybrid, best-fit approach supports the modernization, integration, and deployment of applications, and it often incorporates both mainframes and the cloud. This improves business agility and addresses client pain points such as minimizing the talent gap, shortening time to market, improving access to mission-critical data across platforms, and optimizing expenses.
According to the findings of a recent study conducted by the IBM Institute for Business Value, over seven out of ten IT executives believe that mainframe-based applications are an integral part of their corporate and technological strategy. In addition, the majority of respondents (68%) believe that mainframes are an essential component of their hybrid cloud approach.
However, modernization can be a difficult process, and businesses often find themselves up against a variety of obstacles. A poll of CEOs found that about 70 percent of them believe the mainframe-based programs in their companies are outdated and in need of updating. The survey also found that businesses are twelve times more likely to leverage existing mainframe assets over the next two years than to rebuild their application estates from scratch, which can be prohibitively expensive, risky, or time-consuming. According to a poll of executives at companies currently attempting to modernize their mainframe applications, the most difficult obstacle is a shortage of the necessary resources and expertise. When asked two years ago, executives cited the high cost of mainframes as a key obstacle; this is no longer seen as the case, and executives are instead looking to mainframes for other sources of value, such as resilience, optimization, and regulatory compliance.
Given that application modernization is necessary for businesses concentrating on “best-fit” transformation that spans mainframe, cloud, and even generative AI, IT executives interested in revitalizing their mainframe modernization need to take a few crucial actions right now:
Take an iterative approach
Consider the characteristics of your sector and your workloads as part of the planning process for integrating new and existing environments. Collaborate with your business counterparts to co-create a business case and a “best-fit” roadmap geared to fulfill your strategic objectives; both should be developed to match your needs. Instead of a big-bang approach that rips everything out and starts over, take a gradual and ongoing approach to modernization.
Perform an analysis of your portfolio, then construct your strategy
Investigate the capabilities that determine the mainframe’s current role in your company, as well as how those capabilities connect to the larger hybrid cloud ecosystem. In addition, prioritize cross-skilling employees inside the business and rely on your partners to cover any gaps in talent or resources, whether new or existing.
Leverage a number of different access points for application modernization
Application programming interfaces (APIs) can provide simple access to existing mainframe programs and data. Offer a consistent experience for software developers by combining open-source technologies with a simplified workflow that emphasizes agility. Build cloud-native applications on the mainframe, and containerize existing applications.
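As a rough illustration of that API entry point, here is a minimal sketch of cloud-side code consuming a mainframe-backed service through a REST facade. The endpoint URL, token, and JSON field names are invented for the example and do not refer to any specific product or service.

```python
# Hypothetical example: calling a REST facade that fronts an existing
# mainframe transaction (for instance, one exposed through an API gateway).
# The URL, credentials, and JSON fields are placeholders, not a real service.
import requests

BASE_URL = "https://api.example.com/mainframe/accounts"  # placeholder endpoint


def get_account_balance(account_id: str, token: str) -> float:
    """Fetch a balance that is ultimately served by a legacy mainframe program."""
    response = requests.get(
        f"{BASE_URL}/{account_id}/balance",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    response.raise_for_status()
    return float(response.json()["balance"])  # field name is an assumption


if __name__ == "__main__":
    print(get_account_balance("0012345678", token="example-token"))
```

The value of the facade is that cloud-side developers work with ordinary HTTP and JSON while the existing mainframe program keeps serving the data unchanged behind it.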
Based on “Application modernization on the mainframe – Expanding the value of hybrid cloud transformation,” an IBM Institute for Business Value (IBV) study originally published in 2021 and updated with a double-blind survey of 200 IT executives in North America in April 2023.
0 notes
vax-official · 10 months ago
Text
You might have heard of 32-bit and 64-bit applications before, and if you work with older software, maybe 16-bit and even 8-bit computers. But what came before 8-bit? Was it preceded by 4-bit computing? Were there 2-bit computers? 1-bit? Half-bit?
Well outside that one AVGN meme, half-bit isn't really a thing, but the answer is a bit weirder in other ways! The current most prominent CPU designs come from Intel and AMD, and Intel did produce 4-bit, 8-bit, 16-bit, 32-bit and 64-bit microprocessors (although 4-bit computers weren't really a thing). But what came before 4-bit microprocessors?
Mainframes and minicomputers did. These were large computers intended for organizations instead of personal use. Before microprocessors, they used transistorized integrated circuits (or in the early days even vacuum tubes) and required a much larger space to store the CPU.
And what bit length did these older computers have?
A large variety of bit lengths.
There were 16-bit, 32-bit and 64-bit mainframes/minicomputers, but you also had 36-bit computers (PDP-10), 12-bit (PDP-8), 18-bit (PDP-7), 24-bit (ICT 1900), 48-bit (Burroughs) and 60-bit (CDC 6000) computers among others. There were also computers that didn't use binary encoding to store numbers, such as decimal computers or the very rare ternary computers (Setun).
And evolution didn't always mean extending the bit length: you could upgrade from an 18-bit computer to a more powerful 16-bit computer, which is what the developers of early UNIX did when they switched from the PDP-7 to the PDP-11, or move from 36-bit to 32-bit, which happened when IBM phased out the IBM 7090 in favor of the System/360 and when DEC phased out the PDP-10 in favor of the VAX.
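As a small aside (not part of the original post), the range each of those word sizes can represent follows directly from the fact that the largest unsigned value in an n-bit word is 2^n − 1:

```python
# Largest unsigned value (and decimal digit count) for some historical word sizes.
for bits in (12, 16, 18, 24, 32, 36, 48, 60, 64):
    max_value = 2**bits - 1
    print(f"{bits:2d}-bit word: max unsigned value {max_value} ({len(str(max_value))} digits)")
```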
154 notes · View notes
mariacallous · 3 months ago
Text
Elon Musk’s so-called Department of Government Efficiency (DOGE) has plans to stage a “hackathon” next week in Washington, DC. The goal is to create a single “mega API”—a bridge that lets software systems talk to one another—for accessing IRS data, sources tell WIRED. The agency is expected to partner with a third-party vendor to manage certain aspects of the data project. Palantir, a software company cofounded by billionaire and Musk associate Peter Thiel, has been brought up consistently by DOGE representatives as a possible candidate, sources tell WIRED.
Two top DOGE operatives at the IRS, Sam Corcos and Gavin Kliger, are helping to orchestrate the hackathon, sources tell WIRED. Corcos is a health-tech CEO with ties to Musk’s SpaceX. Kliger attended UC Berkeley until 2020 and worked at the AI company Databricks before joining DOGE as a special adviser to the director at the Office of Personnel Management (OPM). Corcos is also a special adviser to Treasury Secretary Scott Bessent.
Since joining Musk’s DOGE, Corcos has told IRS workers that he wants to pause all engineering work and cancel current attempts to modernize the agency’s systems, according to sources with direct knowledge who spoke with WIRED. He has also spoken about some aspects of these cuts publicly: "We've so far stopped work and cut about $1.5 billion from the modernization budget. Mostly projects that were going to continue to put us down the death spiral of complexity in our code base," Corcos told Laura Ingraham on Fox News in March.
Corcos has discussed plans for DOGE to build “one new API to rule them all,” making IRS data more easily accessible for cloud platforms, sources say. APIs, or application programming interfaces, enable different applications to exchange data, and could be used to move IRS data into the cloud. The cloud platform could become the “read center of all IRS systems,” a source with direct knowledge tells WIRED, meaning anyone with access could view and possibly manipulate all IRS data in one place.
Over the last few weeks, DOGE has requested the names of the IRS’s best engineers from agency staffers. Next week, DOGE and IRS leadership are expected to host dozens of engineers in DC so they can begin “ripping up the old systems” and building the API, an IRS engineering source tells WIRED. The goal is to have this task completed within 30 days. Sources say there have been multiple discussions about involving third-party cloud and software providers like Palantir in the implementation.
Corcos and DOGE indicated to IRS employees that they intended to first apply the API to the agency’s mainframes and then move on to every other internal system. Initiating a plan like this would likely touch all data within the IRS, including taxpayer names, addresses, social security numbers, as well as tax return and employment data. Currently, the IRS runs on dozens of disparate systems housed in on-premises data centers and in the cloud that are purposefully compartmentalized. Accessing these systems requires special permissions and workers are typically only granted access on a need-to-know basis.
A “mega API” could potentially allow someone with access to export all IRS data to the systems of their choosing, including private entities. If that person also had access to other interoperable datasets at separate government agencies, they could compare them against IRS data for their own purposes.
“Schematizing this data and understanding it would take years,” an IRS source tells WIRED. “Just even thinking through the data would take a long time, because these people have no experience, not only in government, but in the IRS or with taxes or anything else.” (“There is a lot of stuff that I don't know that I am learning now,” Corcos tells Ingraham in the Fox interview. “I know a lot about software systems, that's why I was brought in.")
These systems have all gone through a tedious approval process to ensure the security of taxpayer data. Whatever may replace them would likely still need to be properly vetted, sources tell WIRED.
"It's basically an open door controlled by Musk for all American's most sensitive information with none of the rules that normally secure that data," an IRS worker alleges to WIRED.
The data consolidation effort aligns with President Donald Trump’s executive order from March 20, which directed agencies to eliminate information silos. While the order was purportedly aimed at fighting fraud and waste, it also could threaten privacy by consolidating personal data housed on different systems into a central repository, WIRED previously reported.
In a statement provided to WIRED on Saturday, a Treasury spokesperson said the department “is pleased to have gathered a team of long-time IRS engineers who have been identified as the most talented technical personnel. Through this coalition, they will streamline IRS systems to create the most efficient service for the American taxpayer. This week the team will be participating in the IRS Roadmapping Kickoff, a seminar of various strategy sessions, as they work diligently to create efficient systems. This new leadership and direction will maximize their capabilities and serve as the tech-enabled force multiplier that the IRS has needed for decades.”
Palantir, Sam Corcos, and Gavin Kliger did not immediately respond to requests for comment.
In February, a memo was drafted to provide Kliger with access to personal taxpayer data at the IRS, The Washington Post reported. Kliger was ultimately provided read-only access to anonymized tax data, similar to what academics use for research. Weeks later, Corcos arrived, demanding detailed taxpayer and vendor information as a means of combating fraud, according to the Post.
“The IRS has some pretty legacy infrastructure. It's actually very similar to what banks have been using. It's old mainframes running COBOL and Assembly and the challenge has been, how do we migrate that to a modern system?” Corcos told Ingraham in the same Fox News interview. Corcos said he plans to continue his work at IRS for a total of six months.
DOGE has already slashed and burned modernization projects at other agencies, replacing them with smaller teams and tighter timelines. At the Social Security Administration, DOGE representatives are planning to move all of the agency’s data off of legacy programming languages like COBOL and into something like Java, WIRED reported last week.
Last Friday, DOGE suddenly placed around 50 IRS technologists on administrative leave. On Thursday, even more technologists were cut, including the director of cybersecurity architecture and implementation, deputy chief information security officer, and acting director of security risk management. IRS’s chief technology officer, Kaschit Pandya, is one of the few technology officials left at the agency, sources say.
DOGE originally expected the API project to take a year, multiple IRS sources say, but that timeline has shortened dramatically down to a few weeks. “That is not only not technically possible, that's also not a reasonable idea, that will cripple the IRS,” an IRS employee source tells WIRED. “It will also potentially endanger filing season next year, because obviously all these other systems they’re pulling people away from are important.”
(Corcos also made it clear to IRS employees that he wanted to kill the agency’s Direct File program, the IRS’s recently released free tax-filing service.)
DOGE’s focus on obtaining and moving sensitive IRS data to a central viewing platform has spooked privacy and civil liberties experts.
“It’s hard to imagine more sensitive data than the financial information the IRS holds,” Evan Greer, director of Fight for the Future, a digital civil rights organization, tells WIRED.
Palantir received the highest FedRAMP approval this past December for its entire product suite, including Palantir Federal Cloud Service (PFCS) which provides a cloud environment for federal agencies to implement the company’s software platforms, like Gotham and Foundry. FedRAMP stands for Federal Risk and Authorization Management Program and assesses cloud products for security risks before governmental use.
“We love disruption and whatever is good for America will be good for Americans and very good for Palantir,” Palantir CEO Alex Karp said in a February earnings call. “Disruption at the end of the day exposes things that aren't working. There will be ups and downs. This is a revolution, some people are going to get their heads cut off.”
15 notes · View notes
bah-circus · 4 months ago
Note
HAIIII could we request beatrix lebeau (slime rancher) and subspace (phighting!) ? both acrobats ! just beatrix if you don't wanna do multiple requests , we don't mind .3c !! /gen your packs r oh so cool and I love them dearly btw ,,
Thank you in advance !! - @toxin-filled-bahs
Of course dear audience! We have heard your request and have found a suitable performer for you! We hope this performance suits your needs, but you are free to make any adjustments you wish.
❣︎For Our Next Act, Please Welcome,,,❣︎
Beatrix Lebeau & Subspace & Bonus Medkit!!!
°·⊱ Name: Beatrix LeBeau, Bee, Blossom, Reena, Felicity, April
°·⊱ Age: 22
°·⊱ Race/Species: Human
°·⊱ Source: Slime Rancher 1 & 2
°·⊱ Role: Caregiver, Soother
────── · · · · ──────
°·⊱ Sex: Male
°·⊱ Gender: transFem, SpringAdored, Bloomlexic, Vernal Sunsettic, Sungender, Icarusalis 
°·⊱ Pronouns: Shi/Hir; Ae/Aem; Fleu/Fleur; Sun/Suns
°·⊱ Sexuality: Sapphic 
°·⊱ Personality: Beatrix is a very ‘down-to-earth’ type of rancher. Likes to keep things well organized and well planned. Finds large cityscapes to be very overwhelming, and prefers laid back and scenic areas. 
────── · · · · ──────
°·⊱ Nicknames/Titles: The Lonely Rancher, [Prn] Who Explores, [Prn] Who Protects Slimes
°·⊱ Likes: Hard Work, Chores, Slime, Pet Care, Reading, Baking, Mochi Miles, Listening to Mochi explain Anything, Carrots, Plort Collectors, Her Twin, Silence
°·⊱ Dislikes: Jerks, Cityscapes, High Tech [Confusing], Veggie Chips, Tar Slimes [Wants to learn to reverse/cure them], Plort Market [Evil]
°·⊱ Emoji Sign-Off: 🐝🌷🐇💐🥣🌤️
°·⊱ Typing Quirk: Has a thick country / southern accent that shows through in how shi talks in almost all circumstances, very laid back typing and talking style. 
°·⊱ Faceclaim: 1 | 2
°·⊱ Name: Subspace, Specimen, Creator, Nox, Abris, Leonis, Turing, Toxin, Mainframe, Ash, Hollow, Lethe
°·⊱ Age: nullage (it just doesn't care so it stopped counting)
°·⊱ Race/Species: Robloxian, Inphernal / Demon
°·⊱ Source: Phighting! (Roblox)
°·⊱ Role: Prosecutorflux, audeomate, BPD Holder (if applicable)
────── · · · · ──────
°·⊱ Sex: Null
°·⊱ Gender: RXgender, Deosueial, missingsourcegender, spitegender, honeyhimhypic, rotgender, boycorpse, fleshripped, facewoundic, canidevolin, seruadoric, acheantoxic, poisongender, !gender, mischeiviposic, gummybatgender, lovegoric, genderthreat, b?y
°·⊱ Pronouns: it/its; rot/rots; decay/decays; dea/death; vial/vials; tox/toxic; bloom/blooms; ny/nym; cy/cyr; go/gore; he/hem/hemo; wou/wound; h?/h?m; bio/bios; 🧪/🧪s; ☢️/☢️s; rad/radium; radi/radioactive; gli/glitch; pix/pixel; .exe/.exes; ..?/..?s; creepy/creeping; voi/void; cru/crux; rev/revive; gut/guts; kill/kills; end/ends; carc/carcass;  vile/viles; h+/h-m; vi/vital; no/non; that thing/that thing’s; quoi/quoir; sie/hir; ?/?s; !!/!!s; 01/01s; ⚗️/⚗️s; 💊/💊s
°·⊱ Sexuality: Abrosexual
°·⊱ Personality: Energetic, proud, boastful, obsessive, talks a lot, annoying (on purpose, but also not), 
────── · · · · ──────
°·⊱ Nicknames/Titles: Creator of Biografts, Blackrock’s Scientist, The Toxic One, (prn) who poisons, Spec, Noxxy, Abri, Leon, Tox
°·⊱ Likes: Poison, Medkit, energy drinks, candy, loud music, EDM, metal, masks, Biograft, fighting, video games, snakes, technology, inventing, poisonous/toxic flowers and plants, Blackrock, annoying people, science
°·⊱ Dislikes: Medkit, dogs, juice, losing, non-deadly flowers, minimalism, being tired, Medkit again, boring things, math (even though tox is good at it)
°·⊱ Emoji Sign-Off: 🧪☢️⚗️
°·⊱ Typing Quirk: //TYPES IN ALL CAPS LIKE THIS!!!!!!!!//
°·⊱ Extra: Has a love/hate relationship with Medkit
°·⊱ Faceclaim: 1 | 2
°·⊱ Name: Medkit, Aster, Aegis, Hannibal, Cypher, Shard, Servo, Remedy, Nano
°·⊱ Age: 39
°·⊱ Race/Species: Robloxian, Inphernal / Demon
°·⊱ Source: Phighting! Roblox
°·⊱ Role: Observer, Archivist, Academic
────── · · · · ──────
°·⊱ Sex: Intersex
°·⊱ Gender: Galactic Transneu, Frilledcure, Bxy
°·⊱ Pronouns: he/him; gli/glitch; doc/docs; heal/healths; syr/syringe; pill/pills; RX/RXs; bru/bruise; rad/rads; bio/bios; haz/hazards; h?/h?m; vio/violent; cy/cyan; heal/heals; hie/hier; mal/ware/malwares; h-/h+m; via/vial, vi/virus; 👁️‍🗨️/👁️‍🗨️s
°·⊱ Sexuality: Achillean, Idiotsexual/hj
°·⊱ Personality: Stuck-up and Sarcastic. Medkit is asocial, preferring to stay to himself. Gli does not show his emotions well, and tends to snap at others fairly quickly. Though below it all heal does care, RX just doesn’t know how to show it. 
────── · · · · ──────
°·⊱ Nicknames/Titles: Blackrock Deserter, Meddy, Med, Remi, [Prn] Who Opposes Subspace, One of The Lost Temple, Member of the Church of the TRUE EYE, [Prn] Who Heals with Bullets
°·⊱ Likes: Subspace, Chess, Sword [Person], Black Coffee, Classical Music, Organization, Money, Engineering, Unseasoned Food, New Energy Sources
°·⊱ Dislikes: Subspace, Juice, Childish Things, Being a Doctor, Boombox, Biograft, Loud Music, Chaos, Being called Doctor, Crows
°·⊱ Emoji Sign-Off: 💉🩹💊🩵🧪☕
°·⊱ Typing Quirk: This Should Do Nicely For A Typing Quirk, Yes? Everything Capitalized. 
°·⊱ Faceclaim: 1 | 2
An absolutely lovely request!! Thank you! It's extremely kind to hear you enjoy the packs that we provide here!! Bulb and I try our best to put as much care as possible into each one!! It's a little funny seeing you here for a Phighting! introject actually because we had plans to request one from you eventually!! Small world we live in <3 - Pest Swarm
13 notes · View notes
seastarblue · 7 months ago
Text
Bold the Facts Tag!
YAY another one! Thanks for the tag @sunflowerrosy !
Last time was Kaiden’s, so I’m doing… Arbor today! where Kaiden is the protag of Interwoven, Arbor is the protag of AGGTRG! (A Golem’s Guide to Regaining Goodness)
Personal
Financial: wealthy / moderate / unsure / poor / in extreme poverty
Medical: fit / moderate / sickly / disabled / non-applicable (Golem = not able to be defined by humanoid standards)
Class: upper / middle / working / unsure / other
Education: qualified /unqualified / studying / other
Criminal record: yes, for major crimes / yes, for minor crimes / no / has committed crimes but not caught yet / yes, but charges were dismissed
Family
Children: has a child or children / has no children (Fiamma his feral dragon daughter 🥹)
Relationship with family: close with sibling(s) / not close with sibling(s) /has no siblings / sibling(s) is deceased / adding a has no family here
Affiliation: orphaned / abandoned / adopted / found family / disowned / raised by birth parent(s) / not applicable (He doesn’t remember :>)
Traits/Tendencies
Introverted / ambivalent / extroverted
Disorganized / organized / in between
Close-minded / open minded / in between
Calm / anxious / in between / contextual / energetic
Disagreeable / agreeable / in between
Cautions / reckless / in between / contextual
Patient / impatient / in between / contextual
Outspoken / reserved / in between / contextual
Leader / follower / in between / contextual
Empathetic / vicious bastard / in between / contextual
Optimistic / pessimistic / in between
Traditional / modern / in between
Hard working / lazy / in between
Cultured / uncultured / in between / unknown
Loyal / disloyal / unknown / contextual
Faithful / unfaithful / unknown / contextual
Beliefs
Faith: monotheistic / polytheistic / agnostic / atheist / adding an unsure (which normally I’d consider agnostic but in this case? Nah.)
Belief in ghosts or spirits: yes / no / don’t know / don’t care / in a matter of speaking
Belief in an afterlife: yes / no / don’t know / don’t care / in a matter of speaking
Artistic skills: excellent / good / moderate / poor / none (he can write but that’s the extent of it)
Technical skills: excellent / good / moderate / poor / none
Habits
Drinking alcohol: never / special occasions / sometimes / frequently / tried it / alcoholic / former borderline alcoholic turned sober
Smoking: tried it / trying to quit / already quit / never / rarely / sometimes / frequently / chain smoker ([Smoking… there was a time I was on fire…?] ahh character)
Recreational drugs: tried some / never / special occasions / sometimes / frequently / addict
Medicinal drugs: never / no longer needs medication / some medication needed / frequently / to excess / definitely needs some psych meds but doesn’t have access
Unhealthy food: never / special occasions / sometimes / frequently / binge eater (he doesn’t eat :>)
Splurge spending: never / rarely / sometimes / frequently / shopaholic
Gambling: never / rarely/ sometimes / frequently / compulsive gambler
———
no pressure tagging the Tag Game List! Lemme know if you’d like on/off:
@sableglass @dioles-writes @viridis-icithus @allaboutmagic @paeliae-occasionally
@astor-and-the-endless-ink @vsnotresponding @nightlylaments @ancientmyth @vesanal
@thebookishkiwi @verdant-mainframe @threedaysgross @fifis-corner @bamber344
and as always, open tag!
17 notes · View notes
shiprasharma2927 · 2 years ago
Text
Mainframe Performance Optimization Techniques
Mainframe performance optimization is crucial for organizations relying on these powerful computing systems to ensure efficient and cost-effective operations. Here are some key techniques and best practices for optimizing mainframe performance:
1. Capacity Planning: Understand your workload and resource requirements. Accurately estimate future needs to allocate resources efficiently. This involves monitoring trends, historical data analysis, and growth projections.
2. Workload Management: Prioritize and allocate resources based on business needs. Ensure that critical workloads get the necessary resources while lower-priority tasks are appropriately throttled.
3. Batch Window Optimization: Efficiently schedule batch jobs to maximize system utilization. Minimize overlap and contention for resources during batch processing windows.
4. Storage Optimization: Regularly review and manage storage capacity. Employ data compression, data archiving, and data purging strategies to free up storage resources.
5. Indexing and Data Access: Optimize database performance by creating and maintaining efficient indexes. Tune SQL queries to minimize resource consumption and improve response times.
6. CICS and IMS Tuning: Tune your transaction processing environments like CICS (Customer Information Control System) and IMS (Information Management System) to minimize response times and resource utilization.
7. I/O Optimization: Reduce I/O bottlenecks by optimizing the placement of data sets and using techniques like buffering and caching.
8. Memory Management: Efficiently manage mainframe memory to minimize paging and maximize available RAM for critical tasks. Monitor memory usage and adjust configurations as needed.
9. CPU Optimization: Monitor CPU usage and identify resource-intensive tasks. Optimize code, reduce unnecessary CPU cycles, and consider parallel processing for CPU-bound tasks.
10. Subsystem Tuning: Mainframes often consist of various subsystems like DB2, z/OS, and MQ. Each subsystem should be tuned for optimal performance based on specific workload requirements.
11. Parallel Processing: Leverage parallel processing capabilities to distribute workloads across multiple processors or regions to improve processing speed and reduce contention.
12. Batch Processing Optimization: Optimize batch job execution by minimizing I/O, improving sorting algorithms, and parallelizing batch processing tasks.
13. Compression Techniques: Use compression algorithms to reduce the size of data stored on disk, which can lead to significant storage and I/O savings (see the small sketch after this list).
14. Monitoring and Performance Analysis Tools: Employ specialized tools and monitoring software to continuously assess system performance, detect bottlenecks, and troubleshoot issues in real-time.
15. Tuning Documentation: Maintain comprehensive documentation of configuration settings, tuning parameters, and performance benchmarks. This documentation helps in identifying and resolving performance issues effectively.
16. Regular Maintenance: Keep the mainframe software and hardware up-to-date with the latest patches and updates provided by the vendor. Regular maintenance can resolve known performance issues.
17. Training and Skill Development: Invest in training for your mainframe staff to ensure they have the skills and knowledge to effectively manage and optimize the system.
18. Cost Management: Consider the cost implications of performance tuning. Sometimes, adding more resources may be more cost-effective than extensive tuning efforts.
19. Capacity Testing: Conduct load and stress testing to evaluate how the mainframe handles peak workloads. Identify potential bottlenecks and make necessary adjustments.
20. Security Considerations: Ensure that performance optimizations do not compromise mainframe security. Balance performance improvements with security requirements.
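As a small, purely illustrative aside to item 13 (mainframe shops would normally rely on hardware-assisted or storage-subsystem compression rather than a script), this sketch shows how much repetitive, record-oriented data can shrink, with zlib standing in for whatever compression the platform actually provides:

```python
# Minimal illustration of item 13: compressing repetitive, record-oriented data.
# zlib stands in here for the compression facility the platform actually offers.
import zlib

# Fake fixed-width records with heavy repetition, as ledger-style data often has.
records = [f"{i:010d}ACME CORP   USD{i % 7:012d}".encode("ascii") for i in range(10_000)]
raw = b"\n".join(records)

compressed = zlib.compress(raw, 6)
ratio = len(compressed) / len(raw)
print(f"raw: {len(raw):,} bytes, compressed: {len(compressed):,} bytes "
      f"({ratio:.1%} of original)")
```

The same principle drives the storage and I/O savings described above: fewer bytes on disk mean fewer bytes moved per read, at the cost of some CPU spent compressing and decompressing.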
Mainframe performance optimization is an ongoing process that requires constant monitoring and adjustment to meet evolving business needs. By implementing these techniques and best practices, organizations can maximize the value of their mainframe investments and ensure smooth and efficient operations.
0 notes
trivialbob · 2 years ago
Text
For work I have to sign in through a virtual desktop. I hate it.
Compared to pre-virtual desktop days it takes me longer to get signed in. There's often a lag time between pressing a keyboard button and a letter or number appearing on my screen. This delay is a small fraction of a second. But I notice it, and if I type many characters the delay becomes more noticeable.
Death by a thousand cuts.
This morning the virtual desktop system isn't working at all. Under the old way, where we had each application installed on our laptops, if something wasn't working with, for example, the mainframe connection I could still get into Outlook, and vice versa. Virtual desktop is all or zero.
And I'm a zero today.
As I wait I got out my air compressor and started to clean things. Computer keyboards. Vacuum cleaners. That ridge around the top of the Instant Pots. Coffee makers (I have three different types).
I find it soothing (though noisy) getting things clean down to the smallest nooks and crannies.
I probably *cough* should have worn a mask though.
52 notes · View notes
foundationhq · 1 year ago
Text
ACCESS GRANTED TO SITE-φ.
Welcome, 𝐻𝐴𝑅𝑃𝐸𝑅. 𝚃𝚑𝚎 𝙰𝚍𝚖𝚒𝚗𝚒𝚜𝚝𝚛𝚊𝚝𝚘𝚛 is pleased to scout you for the role of [𝑄𝑈𝑂𝑇𝐸 𝑈𝑁𝑄𝑈𝑂𝑇𝐸].
A Succinctly Candid Perspective. A Subtly Crafted Pretense. A Superbly Charismatic Person. Matias — “Loch” — is an amazing find for the Foundation, as well as for us reading your application. It is not lost on us that this role was one of the Foundation's newest acquisitions, and yet your take on him was so intimate, familiar to us. The thorough rapport one feels for a relationship spanning years, every quip so quotable. Loch’s vision, a most singular scope found in his story, elevated him to gleeful heights. Where that zest for anomalies comes from is removed from just fantastical fancy. There is real love in there, protective and strong for the causes he believes in and the people he adores. Really, an absolute treat of an app. We’d let him hack our mainframe any day. Please don’t, though. We need that. We are so incredibly happy to invite you into the Foundation.
Please refer to our checklist for primary onboarding, and have your account ready in 24 hours. The flight to Site-φ leaves on the dot. And 𝚃𝚑𝚎 𝙰𝚍𝚖𝚒𝚗𝚒𝚜𝚝𝚛𝚊𝚝𝚘𝚛 doesn't like to be kept waiting.
5 notes · View notes
readingsquotes · 1 year ago
Text
"This piece aims to identify the pitfalls in thinking about what is being called an ‘algorithmic genocide’ in Gaza. I’d like to push against the exceptionalism afforded to AI; for example pieces which set military uses of AI as distinct from previous iterations of techno-warfare. Rather, the spectre of ‘artificial intelligence’ is a reification—a set of social relations in the false appearance of concrete form; something made by us which has been cast as something outside of us. And the way in which AI has been talked about in the context of a potentially ‘AI-enabled’ genocide in Gaza poses a dangerous distraction. All of the actually interesting and hard problems about AI, besides all the math, lie in its capacity as an intangible social technology and rhetorical device which elides human intention, creating the space of epistemic indeterminacy through which people act.
...The data does not “speak for itself”, neither in the context of academic research or in military applications.
Any ML model is, from its beginning, bound to a human conceptual apparatus.
...
The reification of AI, which happens at all points on the political spectrum, is actively dangerous in the context of its being taken to its most extreme conclusion: in the ‘usage' of ‘AI’ for mass death, as in the case of Gospel (‘Habsora’, הבשורה, named after the infallible word of God) and Lavender. This reification gives cover for politicians and military officers to make decisions about human lives, faking a hand-off of responsibility to a pile of linear algebra and in doing so handing themselves a blank check to do whatever they want. The extent to which these “AI systems” are credible or actually used is irrelevant, because the main purpose they serve is ideological, with massive psychological benefits for those pressing the buttons. Talking about military AI shifts the focus from the social relations between people to the technologies used to implement them, a mystification which misdirects focus and propagates invincibility.
There are things which are horrifying and exceptional about the current genocide, but the deployment of technology is not in itself one of those things; the usage of data-driven methods to conduct warfare is neither ‘intelligent’ nor ‘artificial’, and moreover not even remotely novel. As prior reporting from Ars Technica has shown about the NSA’s SKYNET program in Pakistan, Lavender is not even the first machine learning-driven system of mass assassination. I recently read Nick Turse’s Kill Anything That Moves: The Real American War in Vietnam (2013) and was struck by the parallels to the current campaign of extermination in Gaza, down to the directed-from-above obsession with fulfilling ‘body count’ as well as the creation of anarchic spaces in which lower-level operatives are afforded opportunities to carry out atrocities which were not explicitly ordered, an observation which has also been made of the Shoah. Thinking about it in this way allows us to fold AI into other discourses of technological warfare over the past century, such as the US’s usage of IBM 360 mainframe computers in Vietnam to similarly produce lists of targets under Operation Igloo White. Using technology as rhetorical cover for bureaucratized violence is not new.
The Lavender piece by Yuval Abraham states that IDF soldiers rapidly rubber-stamped bombing targets “despite knowing that the system makes what are regarded as ‘errors’ in approximately 10 percent of cases”. But even if the error rate were 0.005% it wouldn’t matter, because the ‘precision’ canard is just laundering human intent through a justification-manufacturing apparatus which has zero technical component. Abraham reports that “sources who have used Lavender in recent months say human agency and precision were substituted by mass target creation and lethality,” but in reality exactly zero human agency has been removed. He writes that “once the list was expanded to include tens of thousands of lower-ranking operatives, the Israeli army figured it had to rely on automated software and artificial intelligence…AI did most of the work instead”, but this verbiage is a perverse reversal of cause and effect to create post-hoc justification.
...
Another line from the Gospel piece reads “the increasing use of AI based systems like Habsora allows the army to carry out strikes on residential homes where a single Hamas member lives on a massive scale”. Emphasis mine—that word ‘allows’ is the hinge upon which this whole grotesque charade rests. The algorithm isn’t choosing anything; the choices already happened in the compiling and labeling of the dataset. The collecting and categorizing of data—which data on individuals’ social media or GPS movements or purchasing activity is to be used, which to be excluded—is in itself the construction of an elaborate ideological apparatus."
...
The purpose of a system is what it does, and science is a thing which people do
...We can expect the laundering of agency, whitewashed through the ideological device of 'the algorithm', to begin to be deployed in the arena of international law, given the ways in which Israel is already trying to sidestep the ‘genocidal intent’ it has been charged with at the ICJ. "The fetish of AI as a commodity allows companies and governments to sell it, particularly Israel, which still enjoys a fairly glowing reputation in the ML/AI industry and research world."
2 notes · View notes