#Mainframe Applications
shiprasharma2927 · 1 year ago
Text
Tumblr media
Explore modern mainframe migration strategies for a future-ready IT landscape. Embrace agility, cost-efficiency, and innovation.
0 notes
enterprisemobility · 1 year ago
Text
Digital Transformation Services
Your Switch to a Digital Future – Digital Transformation Consulting Services
As a leading name among digital transformation companies and service providers, Enterprise Mobility has been guiding enterprises on their digital transformation journeys for two decades.
1 note · View note
govindhtech · 1 year ago
Text
The Benefits of Mainframe Application Modernization
Tumblr media
The rapid development of innovative technologies, combined with ever-increasing consumer expectations and continuing disruptive market dynamics, is compelling businesses to place a greater emphasis than ever before on digital transformation. In a recent survey conducted by the IBM Institute for Business Value in cooperation with Oxford Economics, 67% of executive respondents stated that their organizations need to transform quickly to keep up with the competition, while 57% reported that current market disruptions are placing unprecedented pressure on their IT.
Because digital transformation puts enormous demands on current applications and data, an enterprise’s heterogeneous technological environment, which may include cloud and mainframe computing, has to be modernized and integrated. It should come as no surprise that chief executive officers have listed the modernization of their companies’ technologies as one of their highest priorities. CEOs are looking to reinvent their goods, services, and operations in order to increase their organizations’ efficiencies, agility, and speed to market.
To run and create services consistently across their hybrid cloud environments, businesses want platforms that are flexible, secure, open, and tailored to their specific needs. Since mission-critical applications continue to take advantage of the capabilities mainframes offer, the mainframe will remain an important component of this process. A hybrid "best-fit" approach supports the modernization, integration, and deployment of applications across both mainframes and the cloud. This improves a company's agility and tackles clients' pain points, such as minimizing the talent gap, shortening time to market, improving access to mission-critical data across platforms, and optimizing expenses.
According to the findings of a recent study by the IBM Institute for Business Value, over seven out of ten IT executives believe that mainframe-based applications are an integral part of their corporate and technological strategy. In addition, the majority of respondents (68%) believe that mainframes are an essential component of their hybrid cloud approach.
However, modernization can be a difficult process, and businesses often find themselves up against a variety of obstacles. A poll of CEOs found that about 70 percent believe the mainframe-based applications in their companies are outdated and need updating. The survey also finds that businesses are twelve times more likely to leverage existing mainframe assets over the next two years than to rebuild their application estates from scratch, which can be prohibitively expensive, risky, or time-consuming. Among executives at companies now attempting to modernize their mainframe applications, the most commonly cited obstacle is a shortage of the necessary resources and expertise. When surveyed two years ago, executives cited the high cost of mainframes as a key obstacle; that is no longer the case, and executives now look to mainframes for other sources of value, such as resilience, optimization, and regulatory compliance.
Given that application modernization is necessary for businesses concentrating on "best-fit" transformation spanning mainframe, cloud, and even generative AI, IT executives interested in revitalizing their mainframe modernization should take a few crucial actions right now:
Take an iterative approach
Consider the characteristics of your sector and your workloads as part of the planning process for integrating new and existing environments. Collaborate with your business counterparts to co-create a business case and a "best-fit" roadmap, both developed to match your needs and geared to your strategic objectives. Rather than a big-bang approach, ripping everything out and starting over, take a gradual, ongoing approach to modernization.
Analyze your portfolio, then build your strategy
Investigate the capabilities that define the mainframe's current role in your company, as well as how those capabilities connect to the larger hybrid cloud ecosystem. Make it a priority to cross-skill employees within the business, and rely on your partners to fill any gaps in talent or resources, whether new or existing.
Use multiple entry points for application modernization
Application programming interfaces (APIs) can provide simple access to existing mainframe programs and data. Offer a consistent experience for software developers by combining open-source technologies with a simplified workflow that emphasizes agility. Build cloud-native applications on the mainframe, and containerize existing applications.
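As a concrete illustration of the API entry point, here is a minimal Python sketch. Everything in it is hypothetical: it assumes a REST endpoint (for example, one published through a gateway such as z/OS Connect) that fronts an existing mainframe transaction, and the URL, field names, and credentials are invented for illustration.

import requests

# Hypothetical REST endpoint fronting a legacy mainframe transaction,
# e.g. one published through an API gateway such as z/OS Connect.
API_BASE = "https://api.example-bank.com/accounts"  # invented URL

def get_balance(account_id: str, token: str) -> float:
    # Call the API that wraps the legacy balance-inquiry program.
    resp = requests.get(
        f"{API_BASE}/{account_id}/balance",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    # The gateway maps the program's COBOL copybook fields to JSON.
    return resp.json()["balance"]

print(get_balance("12345678", token="dev-token"))  # placeholder credentials

The point is that callers see an ordinary HTTPS/JSON interface; none of them need to know that a CICS or IMS program answers on the other end.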
Based on a 2021 IBM Institute for Business Value (IBV) study, "Application modernization on the mainframe – Expanding the value of hybrid cloud transformation," updated with a double-blind poll of 200 IT executives in North America in April 2023.
0 notes
neesonl602 · 2 years ago
Link
Migrating legacy applications to the cloud is a long and difficult process. It requires a lot of expertise and resources in order to get it done right without any hiccups.
0 notes
vax-official · 2 months ago
Text
You might have heard of 32-bit and 64-bit applications before, and if you work with older software, maybe 16-bit and even 8-bit computers. But what came before 8-bit? Was it preceded by 4-bit computing? Were there 2-bit computers? 1-bit? Half-bit?
Well outside that one AVGN meme, half-bit isn't really a thing, but the answer is a bit weirder in other ways! The current most prominent CPU designs come from Intel and AMD, and Intel did produce 4-bit, 8-bit, 16-bit, 32-bit and 64-bit microprocessors (although 4-bit computers weren't really a thing). But what came before 4-bit microprocessors?
Mainframes and minicomputers did. These were large computers intended for organizations instead of personal use. Before microprocessors, they used transistorized integrated circuits (or in the early days even vacuum tubes) and required a much larger space to store the CPU.
And what bit length did these older computers have?
A large variety of bit lengths.
There were 16-bit, 32-bit and 64-bit mainframes/minicomputers, but you also had 36-bit computers (PDP-10), 12-bit (PDP-8), 18-bit (PDP-7), 24-bit (ICT 1900), 48-bit (Burroughs) and 60-bit (CDC 6000) computers among others. There were also computers that didn't use binary encoding to store numbers, such as decimal computers or the very rare ternary computers (Setun).
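(Not from the post, just an illustration: a few lines of Python showing the unsigned integer range each of those word sizes can represent.)

# Unsigned integer range implied by each historical word size above.
for bits in (12, 16, 18, 24, 32, 36, 48, 60, 64):
    print(f"{bits:2d}-bit word: 0 .. {2**bits - 1:,}")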
And bit length didn't always grow over time: you could upgrade from an 18-bit computer to a more powerful 16-bit one, which is what the developers of early UNIX did when they switched from the PDP-7 to the PDP-11. Or a vendor might offer 32-bit over 36-bit, as happened when IBM phased out the IBM 7090 in favor of the System/360, and when DEC phased out the PDP-10 in favor of the VAX.
144 notes · View notes
kathaynesart · 1 year ago
Note
wait, what happened to the side of donnies face that gave him that scar/need for a hearing aid?
It happened years ago while he was trying to hack into the Technodrome. This was done via a Krangified army helicopter they had managed to take down without completely destroying (the only known way in, since the Technodrome itself is completely organic).
However, the attempt failed and his old headphones/visor burst from a power surge. It led to some nasty burns/scarring on the side of his face, and he is practically deaf in that one ear. It was also the incident that made him realize that he couldn't just depend on electronics and his usual tech to try and tap into the enemy's mainframe. That's when he decided to reach out to Barry to try and find more... "Krang applicable" methods of infiltration, eventually leading to the Project Shield and Spear we know today. Thanks for asking! I have way too much backstory that is probably never going to be touched upon in the main storyline, so I'm always happy to talk about it when it's brought up
Tumblr media
999 notes · View notes
mewtwowarrior · 2 years ago
Text
This idea has been digging at me for a while, so I finally started it.
Underneath the Keep Reading is a Tron: Legacy timeline gathering bits and pieces from different sources.
What makes it different from other timelines is that it includes approximate Grid Cycles for each year.
The Tron Wiki states that there are approximately 50 Grid cycles to one User year. I combined this with the Cycle dates from Tron: Evolution to come up with the Grid timeline. The dates may not be exact, but they should be close enough to give a general idea of when things happen relative to each other.
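For anyone who wants to check or extend the numbers, here's a minimal Python sketch of the conversion, assuming the anchor the timeline below implies (1983 ≈ TC1) and a flat 50 cycles per User year:

CYCLES_PER_YEAR = 50              # the Tron Wiki's approximation
ANCHOR_YEAR, ANCHOR_TC = 1983, 1  # assumed from the timeline below

def year_to_tc(year):
    # Approximate Grid cycle (TC) at the start of a given User year.
    return ANCHOR_TC + CYCLES_PER_YEAR * (year - ANCHOR_YEAR)

print(year_to_tc(1989))  # 301, matching the TC301 entries below
print(year_to_tc(2023))  # 2001, matching TC2001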
Some of those sources contradict each other, but I’ve left everything in.
I don’t think it’s completely finished, I’ll keep adding to it as I find more things.
If you’ve found something that I’ve missed that needs to go on it, please let me know! (Especially in regards to things relating to the Flynn Lives ARG. I know there’s some kind of timeline regarding the Tron universe that I’ve seen that I’ve been unable to source.) (There also seems to be a timeline in The Next Day that I haven’t incorporated yet.)
-
Sources: Tron: Legacy official chronology, Tron: Evolution’s Tron Files, the Tron Wiki Timeline, the Flynn Lives ARG Wiki, the Flynn Lives Timeline, the FlynnLives.com information, the WellMinsterAcademy information, the Encom Press Event information, and Tron: Betrayal.
I’ve labeled each piece of information as to where it comes from. Most entries are copied and pasted from the sources indicated.
-
1949 -Kevin Flynn born (Tron Files)
Early 1970’s -Dr. Walter Gibbs leaves academia to found a garage start-up company he calls ENCOM. (Chronology)
1980 -ENCOM creates its first mainframe. (Chronology)
-The Grid forms within the mainframe as a place where programs can freely interact and games are played by programs. (Chronology)
- Dr. Walter Gibbs creates the Master Control Program (MCP) to regulate the mainframe at ENCOM. (Chronology)
-Kevin Flynn earns his doctorate from Cal Tech. He is immediately hired by ENCOM where he quickly climbs the corporate ladder to become a lead software developer. (Chronology)
- Using ENCOM’s facilities, and without the knowledge of his superiors, Flynn designs several games, developing Space Paranoids, Matrix Blaster, Vice Squad, Light Cycles, and numerous other titles. (Chronology)
-Ed Dillinger fires Kevin Flynn, his rival at ENCOM, and takes credit for the games Flynn created under the radar. (Chronology)
-Money comes rolling in to ENCOM as a result of the games Kevin Flynn designed. (Chronology)
-Flynn and Dr. Lora Baines begin to date. It ends after a few months. (Chronology)
1981 -Flynn designed and created the Space Paranoids and Light Cycles games. (Tron Files)
-September 22, 1981 is potentially the day that Dillinger steals Flynn’s games (Found by fights4users here. movie-screencaps’ screenshot of Alan’s terminal and movie-screencaps’ screenshot of Flynn’s printout)
-Ed Dillinger quickly climbs the corporate ladder at ENCOM based on the success of the games he “stole” from Kevin Flynn and he becomes Senior Executive Vice President of ENCOM. (Chronology)
-Dillinger demotes Dr. Walter Gibbs, founder of ENCOM. Gibbs uses his free time to begin research on practical applications of quantum mechanics and lasers. (Chronology)
-The Master Control Program evolves and gains control in the Grid. It begins consuming programs beyond its network in the real world and sends unneeded ones into the Game Grid to ultimately be destroyed. (Chronology)
-Flynn buys an old arcade, calling it Flynn’s Arcade, uses his own games as the focal point of his business. (Chronology)
-Alan Bradley creates Tron 1.0 (Tron Files)
1982
-September 22, 1982 is potentially the day that Flynn, Alan, and Lora break into Encom. (Found by fights4users here. movie-screencaps’ screenshot of Alan’s terminal and movie-screencaps’ screenshot of Flynn’s printout)
-Kevin Flynn hacks the ENCOM mainframe with his search program Clu to find evidence of Ed Dillinger’s wrongdoing. (Chronology)
-The Master Control Program, the overlord of the system at ENCOM, finds and derezzes Clu before he could access the data he was looking for. (Chronology)
-Alan Bradley, a high-level programmer at ENCOM, has suspicions and creates a program named Tron to monitor the Master Control Program to ensure it stays in line. (Chronology)
-Part of the Flynn Lives ARG states that Alan created Tron in 1982. (Flynn Lives ARG Wiki)
-Alan Bradley complains to his co-worker and girlfriend, Dr. Lora Baines, about Ed Dillinger and losing network access at work due to a hacker. Suspecting Kevin Flynn is the hacker, Lora convinces Alan to help warn him. Together, they break into ENCOM so Kevin Flynn can gain access to the mainframe. (Chronology)
-Kevin Flynn is detected in the Grid by the Master Control Program while at a terminal in the Laser Bay. (Chronology)
-After being digitized by a laser into the Grid by the Master Control Program, Kevin Flynn teams up with Alan Bradley’s and Lora Baines’ program avatars in the system — Tron and Yori. Together, they overcome the MCP and stop the corruption of the digital realm. (Chronology)
-When Kevin Flynn is digitized back to the real world, Kevin Flynn has the evidence that he, not Ed Dillinger, wrote the games the company was famous for. (Chronology)
1983 - TC1
-Kevin Flynn becomes the Chief Executive Officer of ENCOM and begins work on a new digital realm — the TRON system. He recreates many programs based on familiar ones in the ENCOM system but with his own flare and ingenuity. (Chronology)
TC1 - Kevin Flynn developed the Grid (and continued development afterwards) (Tron Files)
-Flynn hires Alan Bradley as Chief Operating Officer of ENCOM. (Chronology)
-The first program Kevin Flynn creates is a simple resource distribution platform called Shaddox. (Chronology)
TC1 - Upgraded version of Tron (2.0) brought into the Grid (Tron Files)
TC1 - Tron City created by Flynn (it was regularly modified by Clu after that) (Tron Files)
TC1 - Clu created (Tron Files)
-Tron: Betrayal has Clu created on April 17, 1983. (Tron: Betrayal) (Tron: Betrayal lists April 17th as a Tuesday, which doesn't happen until 1984. April 17 is on a Sunday in 1983.)
-Flynn recreates CLU, now as a control program, to watch over the TRON System when he is not inside. (Chronology)
TC15 - Game Grid created (Tron Files)
TC29 - Radia/Ophelia created (Tron Files)
-Radia is the first ISO on the Grid. (Tron: Betrayal)
TC29 - Jalen created (Tron Files)
TC29 - NAVI BIT created (Tron Files)
-This is the earliest point that Flynn could’ve started bringing ISOs to the ARQ X0711 system. It’s likely that this happened later in the timeline when things got bad between the Basics and ISOs, but there’s been no concrete timeframe given yet. (Tron: Identity)
TC30 - Zuse created (Tron Files)
-Flynn is gone from the Grid for weeks; Clu indicates that he's been gone for hundreds of cycles. He's on the Grid when Sam is born. He had set up a way for Jordan to call him; when she does, Clu notes that Flynn had previously said it was impossible to contact the Grid from the User world. (Tron: Betrayal)
-Sam Flynn is born to Kevin Flynn and Jordan Canas. (Chronology)
-Jordan Canas dies in a car accident. (Chronology)
-A letter that's part of the Flynn Lives ARG says she died in September. (Flynn Lives ARG Wiki)
1984 - TC51
1985 - TC101
-ENCOM has gone public and become the largest video game company in the world. (Chronology)
-Flynn retires from game design to pursue digital research exclusively and focuses his energies on the TRON system — easily traveling in and out of the system by laser technology. (Chronology)
-Part of the Flynn Lives ARG states that Flynn has been widowed since 1985. (Flynn Lives ARG Wiki)
-A letter that's part of the Flynn Lives ARG says Jordan died in September. (Flynn Lives ARG Wiki)
TC146 - Arjia created by Jalen (Tron Files)
-Tron: Evolution (PSP) has to take place after Arjia was created. (Stated in-game date is 1985)
TC148 - Gibson created (Tron Files)
1986 - TC151
-ENCOM becomes an established powerhouse in computing and game culture. (Chronology)
TC162 - Bostrum Colony created (Tron Files)
-Tron: Evolution (DS) has to take place sometime after Bostrum was created.
1987 - TC201
1988 - TC251
-Flynn writes and publishes a controversial book “Digital Frontier.” (Chronology)
-Flynn claims to have stumbled upon an incredible discovery that could change the world and promises to reveal details “soon.” (Chronology)
Approximately TC270 - Quorra created (Tron Files)
-Clu's poisoning of the Sea of Simulation has to take place sometime after Quorra is created. (Tron: Betrayal)
-Tron: Evolution: Battle Grids (Wii) has to take place after Quorra is created and before Jalen's "accident".
TC296 - Jalen’s “accident” in the Grid Games and taken captive by Clu (Tron Files)
1989 - TC301
TC301 - Population in Tron City: 16,453,479 (Majority is Basic programs) (Tron Files)
TC301 - Population in Arjia: 512,486 (Home to both Basics and ISOs) (Tron Files)
TC301 - Population in Bostrum Colony: 6,953 (Bostrumite ISOs only) (Tron Files)
TC301 - Beta test version of Anon installed (Tron Files)
TC301 - The events of Tron: Evolution (PC/Xbox360/PS3) take place.
-According to Tron: Uprising, the date is November 3, 1989. (Tron Wiki)
-A letter that's part of the Flynn Lives ARG mentions Flynn's disappearance by November 6, 1989. (Flynn Lives ARG Wiki)
-Another letter that's part of the Flynn Lives ARG mentions Flynn disappeared in July. (Flynn Lives ARG Wiki)
-In the real world, Flynn disappears completely, leaving his son and company adrift. (Chronology)
-Flynn Lives reports "Kevin Flynn alleged to "disappear." Initial facts raise many questions. Many of us were suspicious." (Flynn Lives ARG Wiki)
-Guardianship of Sam goes to his paternal grandparents. (Chronology)
-With Kevin Flynn gone, the ENCOM board votes Alan Bradley as interim Chief Executive Officer. (Chronology)
1990 - TC351
-Many sightings of Kevin Flynn are reported, but none are confirmed. (Chronology)
-The Flynn Lives! movement begins in earnest. (Chronology)
-Flynn Lives reports "Sightings of Kevin Flynn by ordinary citizens, including high-credibility "Level 3" sightings of Flynn in NYC's Central Park during a Shakespeare Festival, on the fringes of a San Francisco street fair, and the notorious "Elvira" sighting of Kevin Flynn at Halloween celebrations in West Hollywood, California. Unfortunately, these initial sightings display certain characteristics true to this day -- nothing has been confirmed and photographic evidence has been lacking." (Flynn Lives ARG Wiki)
1991 - TC401
1992 - TC451
-Flynn Lives reports "Sightings continue. Several of us make contact thru Usenet and begin correspondence." (Flynn Lives ARG Wiki)
1993 - TC501
1994 - TC551
-First Flynn Lives! meeting is held in Dayton, Ohio. The group organizes efforts to find the truth behind Kevin Flynn’s mysterious disappearance. (Chronology)
-Flynn Lives reports "First Flynn Lives! meet-up in Dayton, Ohio. We resolve to continue our efforts to find out the facts behind the mysterious disappearance." (Flynn Lives ARG Wiki)
1995 - TC601
-Sam’s grandfather dies. (Chronology)
1996 - TC651
1997 - TC701
-In June, it was noted that Sam Flynn started studying capoeira. (Flynn Lives ARG Wiki)
1998 - TC751
-A letter from Kevin Flynn to a founding member of the Flynn Lives! Movement gains media attention, but is subsequently proven a hoax. The recipient of the letter is institutionalized. (Chronology)
-Flynn Lives reports "Letter from Kevin Flynn to a founding member of the group gains media attention, then debunked. Founding member (now ex-member) checks into a mental hospital for observations." (Flynn Lives ARG Wiki)
1999 - TC801
2000 - TC851
-Sam’s grandmother dies. (Chronology)
-A letter dated May 30th states that Sam has a spot at CalTech for the next year and that he has completed all the requirements in his plan of study. And, that he had continued to practice capoeira. (Flynn Lives ARG Wiki)
2001 - TC901
-A $5,000 reward is offered by the Flynn Lives! group to anyone who can provide proof that Kevin Flynn is alive. (Chronology)
-Flynn Lives reports "A $5,000 award is offered to anybody who can prove Kevin Flynn is alive. By December 31st, alas, nobody had satisfied our jury and the money was spent on a great party for all of the "Troniacs" we know and love!" (Flynn Lives ARG Wiki)
2002 - TC951
-Conflict is brewing in the TRON system. (Chronology) (This entry in the Chronology makes me wonder if this is when Tron: Uprising takes place.)
-Flynn Lives reports "An era of low visibility for our group. Sightings drop off, and interest seems to slacken. Thank heavens that is over!" (Flynn Lives ARG Wiki)
2003 - TC1001
-Flynn Lives reports "An era of low visibility for our group. Sightings drop off, and interest seems to slacken. Thank heavens that is over!" (Flynn Lives ARG Wiki)
2004 - TC1051
-Flynn Lives reports "An era of low visibility for our group. Sightings drop off, and interest seems to slacken. Thank heavens that is over!" (Flynn Lives ARG Wiki)
2005 - TC1101
-Flynn Lives reports "An era of low visibility for our group. Sightings drop off, and interest seems to slacken. Thank heavens that is over!" (Flynn Lives ARG Wiki)
2006 - TC1151
-Alan Bradley is stripped of his power as Chief Executive Officer of ENCOM, but is allowed to remain in the company as a figurehead, the Chairman Emeritus. (Chronology)
2007 - TC1201
-The “Albino Cow” sighting of Kevin Flynn sparks renewed interest in the Flynn Lives! group. (Chronology)
-Flynn Lives reports "Interest picks up as the "Albino Cow" Flynn sighting in southern New Jersey energizes a new generation of activists." (Flynn Lives ARG Wiki)
2008 - TC1251
-Uninterested in the family business, Sam Flynn chooses a path of extreme sports and daring stunts. (Chronology)
-Sam Flynn’s preferred vehicle is his father’s old Ducati motorcycle. (Chronology)
2009 - TC1301
-Flynn Lives ARG starts (Flynn Lives ARG Wiki)
2010 - TC1351
-ENCOM is the largest multinational computer technology company in the world. (Chronology)
-Flynn Lives! Organization reveals new information about Kevin Flynn and follows traces of evidence of his mysterious disappearance. (Chronology)
-April 2, 2010 - The Encom Press Event takes place (Flynn Lives ARG Wiki) Video: Part 1 and Part 2
-October 8, 2010 - A temporary portal to the Grid is opened in Disney’s California Adventure - ElecTRONica (Yesterland’s ElecTRONica page)
-December 8, 2010 - Flynn Lives opened the connection that made it possible for the page to be sent to Alan. (Flynn Lives ARG Wiki)
-A mysterious message is sent to Alan Bradley’s old pager — the phone number it came from is the now-abandoned Flynn’s Arcade. (Chronology)
-Solar Sailer Prisoners takes place shortly before the events of Tron: Legacy. (The Tron Wiki)
-The events of Tron: Legacy take place in 2010, possibly on December 17, the release date of the movie.
-Tron: The Next Day takes place. (Tron Wiki) Video: DailyMotion
2011 - TC1401
2012 - TC1451
-April 15, 2012 - The temporary portal to the Grid in Disney’s California Adventure is closed - ElecTRONica (Yesterland’s ElecTRONica page)
2013 - TC1501
2014 - TC1551
2015 - TC1601
2016 - TC1651
-June 16, 2016 - Sam Flynn opens a gateway to the Grid in Shanghai Disneyland - TRON Lightcycle Power Run (DisneyParks Blog and Roller Coaster DataBase)
2017 - TC1701
2018 - TC1751
2019 - TC1801
2020 - TC1851
2021 - TC1901
2022 - TC1951
2023 - TC2001
-April 4, 2023 - Sam Flynn opens a second gateway to the Grid at Walt Disney World’s Magic Kingdom - TRON Lightcycle / Run (DisneyParks Blog and Roller Coaster DataBase)
-The events of Tron: Identity take place in 2023, possibly on April 11, the release date of the game. (Gizmodo Tron: Identity Interview)
2024 - TC2051
2025 - TC2101
89 notes · View notes
prokhorvlg · 2 years ago
Text
Tumblr media
A Datanet advertisement for a specialty microcomputer from the early 2020s
With the cybernetic revolution raging across the world, other digital technologies evolved to support it rather than forming an identity of their own. Given that cybernetics was expected to eventually replace all human-computer interaction, investment into other methods was rare.
By the 2020s, the Datanet existed but primarily for the machine and its programmer. Gigastreams flowed from node to node, carrying terabytes of data between mainframes, robots, and microcomputers. The signals they carried formed the unconscious backbone of society, underground and mostly out of sight.
Between the gigastreams, there existed a space for the human users. The vast majority would be using specialized applications to access electronic conferences, entertainment downloads, interactive encyclopedias, and similar use cases.
The few that ventured further into the machine-facing cyberspace were specialists: cyberneticists, programmers, tinkerers, digital archeologists. It wouldn't be until the first teleindexer — the PAL, from Maple Cybernetic — that the Datanet would be placed into the human palm, fundamentally changing daily life one more time.
109 notes · View notes
trivialbob · 1 year ago
Text
Tumblr media
For work I have to sign in through a virtual desktop. I hate it.
Compared to pre-virtual desktop days it takes me longer to get signed in. There's often a lag time between pressing a keyboard button and a letter or number appearing on my screen. This delay is a small fraction of a second. But I notice it, and if I type many characters the delay becomes more noticeable.
Death by a thousand cuts.
This morning the virtual desktop system isn't working at all. Under the old way, where we had each application installed on our laptops, if something wasn't working with, for example, the mainframe connection, I could still get into Outlook, and vice versa. Virtual desktop is all or zero.
And I'm a zero today.
As I waited, I got out my air compressor and started to clean things. Computer keyboards. Vacuum cleaners. That ridge around the top of the Instant Pots. Coffee makers (I have three different types).
I find it soothing (though noisy) getting things clean down to the smallest nooks and crannies.
I probably *cough* should have worn a mask though.
52 notes · View notes
lesbianchemicalplant · 1 year ago
Text
The Court of Appeal of Brussels has made an interesting ruling. A customer complained that their bank was spelling the customer's name incorrectly. The bank didn't have support for diacritical marks. Things like á, è, ô, ü, ç etc. Those accents are common in many languages. So it was a little surprising that the bank didn't support them. The bank refused to spell their customer's name correctly, so the customer raised a GDPR complaint under Article 16.
“The data subject shall have the right to obtain from the controller without undue delay the rectification of inaccurate personal data concerning him or her.”
Cue much legal back and forth. The bank argued that they simply couldn't support diacritics due to their technology stack. Here's their argument (in Dutch - my translation follows)
Tumblr media
“Bank X also explained that the current customer data management application was launched in 1995 and is still running on a US manufactured mainframe system. This system only supported EBCDIC (“extended binary-coded decimal interchange code”). This is an 8-bit standard for storing letters and punctuation marks, developed in 1963-1964 by IBM for their mainframes and AS/400 computers. The code comes from the use of punch cards and only contains the following characters…”
(Emphasis added.)
EBCDIC is an ancient (and much hated) “standard” which should have been fired into the sun a long time ago. It baffles me that it was still being used in 1995 - let alone today.
Look, I'm not a lawyer (sorry mum!) so I've no idea whether this sort of ruling has any impact outside of this specific case. But, a decade after the seminal Falsehoods Programmers Believe About Names essay, we shouldn't tolerate these sorts of flaws. Unicode - encoded as UTF-8 - just works. Yes, I'm sure there are some edge-cases. But if you can't properly store human names in their native language, you're opening yourself up to a lawsuit.
Source: GDPRhub - 2019/AR/1006
The Court of Appeal of Brussels held that, in accordance with Article 16 GDPR, the data subject has the right for their name to be correctly spelled when processed by the computer systems of the Bank. To claim in 2019 that adapting a computer system to correctly handle diacritics would cost several months of work and/or constitute additional costs for the Bank, does not allow the Bank to disregard the rights of the data subject. A correctly functioning banking institution may be expected to have computing systems that meet current standards, including the right to correct spelling of people's names.
(decided on September 10th, 2019)
Extended Binary Coded Decimal Interchange Code (EBCDIC) is an eight-bit character encoding used mainly on IBM mainframe and IBM midrange computer operating systems. It descended from the code used with punched cards and the corresponding six-bit binary-coded decimal code used with most of IBM's computer peripherals of the late 1950s and early 1960s. [...] While IBM was a chief proponent of the ASCII standardization committee, the company did not have time to prepare ASCII peripherals (such as card punch machines) to ship with its System/360 computers, so the company settled on EBCDIC.
literally pre-ascii 😶
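To make the gap concrete: Python ships codecs for several EBCDIC code pages, so the failure mode is easy to demonstrate. Whether an accented name survives a round trip depends entirely on which code page the legacy system used, while UTF-8 round-trips any name (the sample names below are invented):

name = "Çelik-Ünal"  # invented customer name with diacritics

# UTF-8 stores any Unicode text, losslessly.
assert name.encode("utf-8").decode("utf-8") == name

# EBCDIC is a family of single-byte code pages, and coverage varies.
# CP500 ("EBCDIC International") happens to cover the Latin-1 accents:
assert name.encode("cp500").decode("cp500") == name

# ...but anything outside that repertoire simply cannot be stored.
try:
    "Łukasz".encode("cp500")  # Ł is not in Latin-1
except UnicodeEncodeError as err:
    print("cp500 cannot store this name:", err)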
29 notes · View notes
shiprasharma2927 · 1 year ago
Text
Mainframe Performance Optimization Techniques
Tumblr media
Mainframe performance optimization is crucial for organizations relying on these powerful computing systems to ensure efficient and cost-effective operations. Here are some key techniques and best practices for optimizing mainframe performance:
1. Capacity Planning: Understand your workload and resource requirements. Accurately estimate future needs to allocate resources efficiently. This involves monitoring trends, historical data analysis, and growth projections.
2. Workload Management: Prioritize and allocate resources based on business needs. Ensure that critical workloads get the necessary resources while lower-priority tasks are appropriately throttled.
3. Batch Window Optimization: Efficiently schedule batch jobs to maximize system utilization. Minimize overlap and contention for resources during batch processing windows.
4. Storage Optimization: Regularly review and manage storage capacity. Employ data compression, data archiving, and data purging strategies to free up storage resources.
5. Indexing and Data Access: Optimize database performance by creating and maintaining efficient indexes. Tune SQL queries to minimize resource consumption and improve response times.
6. CICS and IMS Tuning: Tune your transaction processing environments like CICS (Customer Information Control System) and IMS (Information Management System) to minimize response times and resource utilization.
7. I/O Optimization: Reduce I/O bottlenecks by optimizing the placement of data sets and using techniques like buffering and caching.
8. Memory Management: Efficiently manage mainframe memory to minimize paging and maximize available RAM for critical tasks. Monitor memory usage and adjust configurations as needed.
9. CPU Optimization: Monitor CPU usage and identify resource-intensive tasks. Optimize code, reduce unnecessary CPU cycles, and consider parallel processing for CPU-bound tasks.
10. Subsystem Tuning: Mainframes often consist of various subsystems like DB2, z/OS, and MQ. Each subsystem should be tuned for optimal performance based on specific workload requirements.
11. Parallel Processing: Leverage parallel processing capabilities to distribute workloads across multiple processors or regions to improve processing speed and reduce contention.
12. Batch Processing Optimization: Optimize batch job execution by minimizing I/O, improving sorting algorithms, and parallelizing batch processing tasks (a generic sketch follows this list).
13. Compression Techniques: Use compression algorithms to reduce the size of data stored on disk, which can lead to significant storage and I/O savings.
14. Monitoring and Performance Analysis Tools: Employ specialized tools and monitoring software to continuously assess system performance, detect bottlenecks, and troubleshoot issues in real-time.
15. Tuning Documentation: Maintain comprehensive documentation of configuration settings, tuning parameters, and performance benchmarks. This documentation helps in identifying and resolving performance issues effectively.
16. Regular Maintenance: Keep the mainframe software and hardware up-to-date with the latest patches and updates provided by the vendor. Regular maintenance can resolve known performance issues.
17. Training and Skill Development: Invest in training for your mainframe staff to ensure they have the skills and knowledge to effectively manage and optimize the system.
18. Cost Management: Consider the cost implications of performance tuning. Sometimes, adding more resources may be more cost-effective than extensive tuning efforts.
19. Capacity Testing: Conduct load and stress testing to evaluate how the mainframe handles peak workloads. Identify potential bottlenecks and make necessary adjustments.
20. Security Considerations: Ensure that performance optimizations do not compromise mainframe security. Balance performance improvements with security requirements.
Mainframe performance optimization is an ongoing process that requires constant monitoring and adjustment to meet evolving business needs. By implementing these techniques and best practices, organizations can maximize the value of their mainframe investments and ensure smooth and efficient operations.
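To make points 11 and 12 concrete, here is a minimal, generic Python sketch of fanning a batch workload out across parallel workers. It is illustrative only: the record source and per-record work are invented stand-ins, and real mainframe batch parallelism would rely on platform facilities (multiple initiators, job splitting, sort utilities) rather than Python.

from concurrent.futures import ThreadPoolExecutor

def process_record(record):
    # Stand-in for per-record batch work (parse, transform, write).
    return record.upper()

records = [f"record-{i}" for i in range(1000)]  # stand-in for a batch input

# Fan the batch out across workers instead of processing serially.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(process_record, records))

print(len(results), "records processed")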
0 notes
foundationhq · 9 months ago
Text
Tumblr media
ACCESS GRANTED TO SITE-φ.
Welcome, 𝐻𝐴𝑅𝑃𝐸𝑅. 𝚃𝚑𝚎 𝙰𝚍𝚖𝚒𝚗𝚒𝚜𝚝𝚛𝚊𝚝𝚘𝚛 is pleased to scout you for the role of [𝑄𝑈𝑂𝑇𝐸 𝑈𝑁𝑄𝑈𝑂𝑇𝐸].
A Succinctly Candid Perspective. A Subtly Crafted Pretense. A Superbly Charismatic Person. Matias — “Loch” — is an amazing find for the Foundation, as well as for us reading your application. It is not lost on us that this role was one of the Foundation's newest acquisitions, and yet your take on him was so intimate, familiar to us. The thorough rapport one feels for a relationship spanning years, every quip so quotable. Loch’s vision, a most singular scope found in his story, elevated him to gleeful heights. Where that zest for anomalies comes from is removed from just fantastical fancy. There is real love in there, protective and strong for the causes he believes in and the people he adores. Really, an absolute treat of an app. We’d let him hack our mainframe any day. Please don’t, though. We need that. We are so incredibly happy to invite you into the Foundation.
Please refer to our checklist for primary onboarding, and have your account ready in 24 hours. The flight to Site-φ leaves on the dot. And 𝚃𝚑𝚎 𝙰𝚍𝚖𝚒𝚗𝚒𝚜𝚝𝚛𝚊𝚝𝚘𝚛 doesn't like to be kept waiting.
5 notes · View notes
golmac · 1 year ago
Text
More Inform Basics (#2)
Last time, I talked about getting a project ready. We're still doing that, but I need to take a minute to talk about output. There aren't a lot of options for configuring the appearance of text output in default Inform.
If this seems surprising, remember that the original prototypes for parser games, Adventure and Zork, were written to be experienced via mainframe terminals. Later iterations upon this model used virtual machines that prioritized portability over system-specific features. There were tons of micros in the 80s, and each did things their own way. This method made it possible to port games to many different systems.
The bottom line is that most features related to appearance (font, colors, etc.) are not in the "story" at all*; they're handled by what is called the "interpreter." The story is the actual game file. An interpreter executes the game. The story is platform-agnostic; the interpreter is system-specific. In an interpreter application, a player can change some formatting settings themselves. In a browser-based interpreter, the author can mess with CSS if they are in a position to publish the game somewhere, since web browsers don't have settings specific to parser games.
[*Technically, there are ways to set colors in an Inform game, but the method doesn't work with all interpreters. Unless you publish a web-playable version, you can't control which interpreters players use.]
Whew, TL;DR: vanilla Inform's DNA is descended from a mainframe terminal model. Why is this important? Well, there are still some things under your control as an author, and it's best to do this in a simple, readable way.
Almost everything the player sees is coded with a "say" construction. If we want to print some text just as the game begins (before the player's first turn), we could just:
when play begins:
     say "Thank you for playing my game!".
[note: I couldn’t find a way to insert a tab, so I used five spaces instead. tabs are important in Inform 7!]
Let's say we wanted to say something after the player examines a lamp:
after examining the lamp:
     say "I think we did the right thing, choosing to examine that lamp."
Trying to tie everything together: we have a condition that gets set at compile time (test vs. production). We talked about this last time:
PROD is a truth state that varies.
Next time, we'll try using the PROD condition while tweaking output. It's pretty straightforward!
12 notes · View notes
watching-pictures-move · 9 months ago
Text
Put On Your Raincoats | Randy: The Electric Lady (Schuman & Strong, 1980)
Tumblr media
This is the second movie I’ve seen co-directed by Zachary Strong, and like Little Showoffs, there’s a deconstructive streak. That movie was about exploring the fantasies of its participants, alternating between interviews and enactments, and eventually pulling back the curtain to show the work that goes into making these things. That movie’s overall attitude was fairly warm and supportive. This one’s, maybe less so. Here we find ourselves at some kind of institute of sexual studies where scientists are doing some very scientific research on the science of orgasms. For science, you see.
How it’s presented is any number of beautiful women in the throes of ecstasy, subject to stimuli either external or self-applied, while strapped to electrodes and the like, so they can be observed. For science, you see. Now, one might not take too much issue with being conflated with the nice looking ladies in the cast, but one might object a little more to their relative lack of agency. The movie is softening the blow, but it’s still poking fun at you, the viewer. The height of these jabs comes during a sequence where the subjects are forced to watch specially edited "commercial porno films" and masturbate, and what we see of the films is played so quickly and cut so incoherently that one wonders how anyone could get off on them, which I suppose is the folly of skipping to the good parts. One of the scientists admits that pornography doesn’t do anything for him, as “there’s never any story” and you never get to know the characters, and the movie’s satire comes into focus. And even a character’s eventual sexual actualization is defined by a number of preprogrammed stimuli and positions.
The plot eventually turns to the extraction of “Orgasmine”, which allows two of the scientists to experience sex with each other entirely in their minds without actual physical interaction. One can speculate what sort of applications this might have in the name of science, but one should be wary lest it be used for less altruistic motives, such as world domination. Which may or may not be the motive of a mad scientist played by uberMILF Juliet Anderson, who may or may not want “enough Orgasmine to control the vurld!”.
Listen, you get Anderson chewing the scenery with a shitty German accent and a femdom lite routine, and this is automatically a good movie. You get the cute as a button Desiree Cousteau as the titular character (yes, Randy is a girl’s name here) and you have an even better movie. You get Lori Blue and Jesie St. James in the supporting cast and I certainly ain’t complaining. But what takes this to the next level is the forceful style with which this is executed. There’s an emphasis on sensory overload, like a sequence that cuts from Cousteau’s garden fantasy to her masturbating frantically to a pair of scientists fucking while we’re hit with colour effects and punchcards explode out of a computer (another scientist warns them “There’s no fucking in the sex institute!”). Or the porno sequence where we’re hit with a barrage of explicit imagery while punk rock blares on the soundtrack. Or the dissolve-heavy scene where Cousteau and Blue masturbate together while reminiscing about past lovers, which has probably the best cross-cutting during sex scenes I’ve seen in porno.
The horror movie aesthetics and mix of coldness and camp invite comparisons with another pornographic favourite of mine, Nightdreams, although this leans heavier on the camp than the coldness. The lab setting, which includes a sassy mainframe computer and a mechanized dildo, on top of all the monitors, tape machines and cold interiors, help give this a distinct visual identity. So much so that the heavily degraded print, which alternated between giving the movie a gummy candy green and steely blue colour palette, didn’t manage to sink it. Ideally we’ll get a restoration of this at some point, but in any case it’s well worth a look.
5 notes · View notes
mitchipedia · 1 year ago
Text
My latest article: An IBM watsonx AI tool helps refactor COBOL mainframe code into Java, to make it easier to maintain and extend for folks who entered the workforce after the disco era.
5 notes · View notes
mariacallous · 1 year ago
Text
In the first four months of the Covid-19 pandemic, government leaders paid $100 million for management consultants at McKinsey to model the spread of the coronavirus and build online dashboards to project hospital capacity.
It's unsurprising that leaders turned to McKinsey for help, given the notorious backwardness of government technology. Our everyday experience with online shopping and search only highlights the stark contrast between user-friendly interfaces and the frustrating inefficiencies of government websites—or worse yet, the ongoing need to visit a government office to submit forms in person. The 2016 animated movie Zootopia depicts literal sloths running the DMV, a scene that was guaranteed to get laughs given our low expectations of government responsiveness.
More seriously, these doubts are reflected in the plummeting levels of public trust in government. From early Healthcare.gov failures to the more recent implosions of state unemployment websites, policymaking without attention to the technology that puts the policy into practice has led to disastrous consequences.
The root of the problem is that the government, the largest employer in the US, does not keep its employees up-to-date on the latest tools and technologies. When I served in the Obama White House as the nation’s first deputy chief technology officer, I had to learn constitutional basics and watch annual training videos on sexual harassment and cybersecurity. But I was never required to take a course on how to use technology to serve citizens and solve problems. In fact, the last significant legislation about what public professionals need to know was the Government Employee Training Act, from 1958, well before the internet was invented.
In the United States, public sector awareness of how to use data or human-centered design is very low. Out of 400-plus public servants surveyed in 2020, less than 25 percent received training in these more tech-enabled ways of working, though 70 percent said they wanted such training. 
But knowing how to use new technology does not have to be an afterthought, and in some places it no longer is. In Singapore, the Civil Service Training College requires technology and digital-skills training for its 145,000 civilian public servants. Canada’s “Busrides” training platform gives its quarter-million public servants short podcasts on topics like data science, AI, and machine learning to listen to during their commutes. In Argentina, career advancement and salary raises are tied to the completion of training in human-centered design and data-analytical thinking. When public professionals possess these skills—learning how to use technology to work in more agile ways, getting smarter from both data and community engagement—we all benefit.
Today I serve as chief innovation officer for the state of New Jersey, working to improve state websites that deliver crucial information and services. When New Jersey’s aging mainframe strained under the load of Covid jobless claims, for example, we wrote forms in plain language, simplified and eliminated questions, revamped the design, and made the site mobile-friendly. Small fixes that came from sitting down and listening to claimants translated into 48 minutes saved per person per application. New Jersey also created a Covid-19 website in three days so that the public had the information they wanted in one place. We made more than 134,000 updates as the pandemic wore on, so that residents benefited from frequent improvements.
Now with the explosion of interest in artificial intelligence, Congress is turning its attention to ensuring that those who work in government learn more about the technology. US senators Gary Peters (D-Michigan) and Mike Braun (R-Indiana) are calling for universal leadership training in AI with the AI Leadership Training Act, which is moving forward to the full Senate for consideration. The bill directs the Office of Personnel Management (OPM), the federal government's human resources department, to train federal leadership in AI basics and risks. However, it does not yet mandate the teaching of how to use AI to improve how the government works.
The AI Leadership Training Act is an important step in the right direction, but it needs to go beyond mandating basic AI training. It should require that the OPM teach public servants how to use AI technologies to enhance public service by making government services more accessible, providing constant access to city services, helping analyze data to understand citizen needs, and creating new opportunities for the public to participate in democratic decisionmaking.
For instance, cities are already experimenting with AI-based image generation for participatory urban planning, while San Francisco’s PAIGE AI chatbot is helping to answer business owners' questions about how to sell to the city. Helsinki, Finland, uses an AI-powered decisionmaking tool to analyze data and provide recommendations on city policies. In Dubai, leaders are not just learning AI in general, but learning how to use ChatGPT specifically. The legislation, too, should mandate that the OPM not just teach what AI is, but how to use it to serve citizens.
In keeping with the practice in every other country, the legislation should require that the training be free. This is already the case for the military. On the civilian side, however, the OPM is required to charge a fee for its training programs. A course titled Enabling 21st-Century Leaders, for example, costs $2,200 per person. Even if the individual applies to their organization for reimbursement, too often programs do not have budgets set aside for up-skilling.
If we want public servants to understand AI, we cannot charge them for it. There is no need to do so, either. Building on a program created in New Jersey, six states are now collaborating with each other in a project called InnovateUS to develop free live and self-paced learning in digital, data, and innovation skills. Because the content is all openly licensed and designed specifically for public servants, it can easily be shared across states and with the federal government as well.
The Act should also demand that the training be easy to find. Even if Congress mandates the training, public professionals will have a hard time finding it without the physical infrastructure to ensure that public servants can take and track their learning about tech and data. In Germany, the federal government’s Digital Academy offers a single site for digital up-skilling to ensure widespread participation. By contrast, in the United States, every federal agency has its own (and sometimes more than one) website where employees can look for training opportunities, and the OPM does not advertise its training across the federal government. While the Department of Defense has started building USALearning.gov so that all employees could eventually have access to the same content, this project needs to be accelerated.
The Act should also require that data on the outcomes of AI training be collected and published. The current absence of data on federal employee training prevents managers, researchers, and taxpayers from properly evaluating these training initiatives. More comprehensive information about our public workforce, beyond just demographics and job titles, could be used to measure the impact of AI training on cost savings, innovation, and performance improvements in serving the American public.
Unlike other political reforms that could take generations to achieve in our highly partisan and divisive political climate, investing in people—teaching public professionals how to use AI and the latest technology to work in more agile, evidence-based, and participatory ways to solve problems—is something we can do right now to create institutions that are more responsive, reliable, and deserving of our trust.
I understand the hesitance to talk about training people in government. When I worked for the Obama White House, the communications team was reluctant to make any public pronouncements about investing in government lest we be labeled “Big Government” advocates. Since the Reagan years, Republicans have promoted a “small government” narrative. But what matters to most Americans is not big or small but that we have a better government.
6 notes · View notes