#those names were practically a secret code and obfuscation was the point
mumblesplash · 2 years ago
ohh ok i see where you’re coming from with this ty for explaining!
see i only showed up january of this year, and to me the -duo naming convention seemed like it was just How People Do Things? like a tacked-on suffix that effectively meant ‘oh btw these are mcyt personas’ and also, like you said, had the benefit of carrying no implication that they were being shipped together. and being as new as i was, i kinda liked having the first part of the name as a signifier of which specific event or series of events made that pair dynamic popular.
what’s interesting to me is that obviously the ‘duo’ convention isn’t necessary for that last part, but it *does* have the effect of p consistently focusing the name on the pairings’ origin stories? (which i guess makes sense if it’s a carry-over from plot point specific names from the dsmp)
portmanteaus or full names make it more obvious *who* is being talked about, but not *why*, and in my experience the ‘why’ was always much harder to pin down. sometimes with good reason—i haven’t encountered a duo name alternative for grumbo yet, and other than predating the dsmp i’d bet that’s because their dynamic evolved over 4 entire seasons of hermitcraft and doesn’t really revolve around anything specific other than their friendship—but in a lot of cases imo it makes sense for a pair’s name to be a direct reference to their main Thing.
again, not in favor of the duo names necessarily (‘rancher duo’ doesn’t add any more context than ‘team rancher’ and personally i tend to prefer the canon team names when they exist), but i do think they caught on because they serve a purpose. and if so it’s probably worth figuring out exactly what it is, because even though the alternatives are more creative and fun the duo names aren’t going to stop until there’s something else that serves that purpose better
Guys...guys. It needs to stop.
recentnews18-blog · 6 years ago
New Post has been published on https://shovelnews.com/why-was-equifax-so-stupid-about-passwords/
Why Was Equifax So Stupid About Passwords?
Data Breach, Data Loss, Encryption & Key Management
Massive Credit Bureau Stored Users’ Plaintext Passwords in Testing Environment
Mathew J. Schwartz (euroinfosec) • September 24, 2018
Excerpt from the U.K. Information Commissioner Office’s monetary penalty notice against Equifax
Massive, well-resourced companies are still using live customer data in testing environments, violating not just good development practices but also privacy laws.
That’s yet another security failure takeaway from last year’s massive Equifax breach.
Compared to Equifax’s poor information security and patch management practices, which led to the loss of personally identifiable information for at least 145.5 million U.S. consumers, 15.2 million U.K. consumers and 8,000 Canadian consumers, using live data in a testing environment might seem to be a footnote. But it’s not.
Indeed, Equifax managed to compound the severity of its breach by also storing copies of users’ passwords in a plaintext file, even though its own cryptographic standards stated that passwords should only ever be stored in encrypted, hashed, masked, tokenized or other approved formats.
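Equifax’s own cryptographic standards point in the right direction here: passwords should be stored only as salted, slow hashes that cannot be reversed. As a minimal sketch of that idea (illustrative only, not Equifax’s actual implementation; the iteration count is an assumption), Python’s standard library can do this with PBKDF2:

```python
import hashlib
import hmac
import os

# Illustrative parameter, not any company's standard; tune iterations to your hardware.
ITERATIONS = 600_000

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, derived_key); store both, never the plaintext password."""
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return salt, key

def verify_password(password: str, salt: bytes, expected_key: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    key = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return hmac.compare_digest(key, expected_key)
```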
Credit Bureau Profits From Consumer Data
Equifax isn’t the first corporate giant that failed to properly secure consumer data (see Why Are We So Stupid About Passwords? Yahoo Edition).
But the security failure at Equifax is especially egregious, given that it generates massive profits from buying, sharing and selling personal information – often without individuals’ knowledge – but still failed to have the right resources in place to ensure that it was also securing this sensitive information.
Some privacy and security experts point out that many consumers would never have known that the company was acquiring, selling or storing their personal details.
“Equifax Ltd. showed a serious disregard for their customers and the personal information entrusted to them,” U.K. Information Commissioner Elizabeth Denham said last week. “Many of the people affected would not have been aware the company held their data; learning about the cyberattack would have been unexpected and is likely to have caused particular distress.”
Denham helms the Information Commissioner’s Office, which is the U.K.’s data protection authority responsible for enforcing the country’s privacy laws.
Password Security Failures
In the 32-page monetary penalty notice (PDF) issued against Equifax last week, the ICO cites a long list of failures at Equifax that contributed to the breach. Those failures include Equifax creating a “GCS dataset” – for Global Consumer Services – that attackers compromised, which contained 14,961 U.K. “data subjects’ name, address, date of birth, username, password (in plaintext), secret question and answer (in plaintext), credit card number (obscured) and some payment amounts.”
The ICO notes that the compromised data was being stored in a plaintext file labeled as being the “Standard Fraud Daily” report, which Equifax said was designed to be a “snapshot in time” of the GCS data.
“The file was held in a fileshare, which was accessible by multiple users – including system administrators and middleware technicians – for the purposes of maintenance and/or the release of application code. The file contained ‘live’ data taken from the GCS dataset which was created for testing purposes, with the intention of eventually sending it to Equifax Ltd.’s Fraud Investigations Team in the U.K.,” the ICO says. “Equifax Ltd has stated that the file was used in order to perform password analysis for the purposes of fraud prevention.”
But the ICO said this was not a valid reason for Equifax having failed to secure the data. “The commissioner has seen no adequate evidence or explanation indicating that this was a valid reason for this data not being processed in accordance with Equifax’s data handling and cryptography standards, particularly given the existence of several other fraud prevention techniques in use at the time, none of which required personal data to be stored in plaintext form,” the ICO says.
The privacy watchdog also notes “that Equifax has subsequently ceased the practice of storing passwords in plaintext whilst still being able to achieve its fraud prevention aims.”
Excerpt from the ICO’s monetary penalty notice against Equifax
Channel ‘Lorem Ipsum’
In this day and age, there is no excuse for developers to be using live data in testing environments.
Substituting fake but lookalike data isn’t a new concept. Arguably, it dates from the heady “greeking” days of the 1500s, when printers and typesetters began using “lorem ipsum” – nonsensical Latin – as placeholder text.
Enter the digital age: Developers need to ensure that when users enter a value into a 16-digit credit card field, for example, their application handles it correctly. But using live data in test environments increases the risk that insiders or outsiders who shouldn’t be seeing the data might have access to it.
That’s why numerous development tools offer the ability to obfuscate and mask live data, as well as to generate “good enough” test data that developers can use instead.
European IT market researcher Bloor Research notes that such tools are available from a variety of vendors, including CA, Compuware, Dataprof, Dataguise, Delphix, HPE, IBM, Imperva Camouflage, IMS Privacy Analytics, Informatica, Mentis, Net 2000, Protegrity and Solix.
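As a rough sketch of what such masking can look like in practice (a generic illustration, not any particular vendor’s product; the record fields below are hypothetical), the idea is to swap identifying values for synthetic, format-preserving stand-ins before a dataset ever reaches a test environment:

```python
import hashlib
import random

def mask_record(record: dict) -> dict:
    """Return a copy safe for test environments: realistic shape, no real PII."""
    masked = dict(record)
    # Deterministic pseudonym so joins across tables still line up.
    masked["username"] = "user_" + hashlib.sha256(record["username"].encode()).hexdigest()[:12]
    # Format-preserving stand-in: 16 digits, but a fake value (not Luhn-valid).
    masked["card_number"] = "4000" + "".join(random.choices("0123456789", k=12))
    # Drop secrets outright rather than masking them.
    masked["password"] = None
    masked["secret_answer"] = None
    return masked

if __name__ == "__main__":
    live = {"username": "jsmith", "card_number": "4111111111111111",
            "password": "hunter2", "secret_answer": "Rex"}
    print(mask_record(live))
```

The deterministic pseudonym keeps referential integrity across tables, while secrets such as passwords and security answers are dropped rather than masked.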
Equifax Failed to Obtain Consent
Equifax compounded its data security and privacy failures by not only storing plaintext passwords and security questions and answers in a plaintext file, but also not obtaining users’ consent for doing so.
Under the U.K.’s Data Protection Act, data subjects must give “specific and informed indication” of the ways in which they will allow their data to be processed.
The ICO asked Equifax why it had failed to obtain consent from users to store their plaintext passwords and security questions and other data in a plaintext file.
“Equifax suggested that informing data subjects that their passwords would be stored in plaintext form would have created a security risk,” the ICO says. “The commissioner’s view is that this type of processing activity was an inappropriate security risk, particularly given the state of the art and costs of implementation as regards appropriate technical measures to protect personal data, the resources available to an organization of Equifax’s size, and the nature of the processing it undertook.”
UK Hits Equifax With Maximum Fine
Earlier this month, the U.S. Government Accountability Office issued a report into the Equifax breach that described five key factors that contributed to the breach (see Postmortem: Multiple Failures Behind the Equifax Breach).
The ICO’s penalty notice cites some of these same failures, including Equifax having failed to renew a digital certificate for more than a year, which left one of its network scanning tools unable to scan encrypted traffic for signs of malicious activity. That turned out to be how attackers exfiltrated stolen data from Equifax starting in May 2017, as Equifax discovered in July 2017 after it renewed the certificate and the tool began working again.
Taking these and other failures into account, the ICO last week imposed the maximum possible fine on Equifax.
Luckily for Equifax, its breach occurred before the May 25 start of enforcement for the EU’s General Data Protection Regulation.
Organizations that fail to comply with GDPR’s privacy requirements face fines of up to 4 percent of their annual global revenue or €20 million ($23 million), whichever is greater. Organizations that fail to comply with GDPR’s reporting requirements also face a separate fine of up to €10 million ($12 million) or 2 percent of annual global revenue (see GDPR Effect: Data Protection Complaints Spike).
Under the previous data protection laws, however, the maximum – and levied – fine facing Equifax was just £500,000 ($660,000), or 0.02 percent of the company’s 2017 annual global revenue of $3.4 billion.
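Back-of-the-envelope, using the revenue figure cited above (illustrative arithmetic only):

```python
# Rough comparison of the pre-GDPR fine with the GDPR ceiling, using figures from the article.
revenue_2017 = 3.4e9       # Equifax's 2017 global revenue, USD
actual_fine = 660_000      # the dollar figure cited for the GBP 500,000 penalty
gdpr_cap = max(0.04 * revenue_2017, 23e6)   # 4% of revenue or ~$23M, whichever is greater

print(f"Actual fine: {actual_fine / revenue_2017:.3%} of revenue")                          # ~0.019%
print(f"GDPR ceiling: ${gdpr_cap / 1e6:.0f}M, or {gdpr_cap / revenue_2017:.0%} of revenue")  # ~$136M, 4%
```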
Few Repercussions in U.S.
While Europe continues to crack down on companies that fail to properly secure private data, many information security experts say the U.S. lags.
Information assurance trainer William Hugh Murray says big credit bureaus such as Equifax should be held to a higher standard of security, given the types of PII they handle.
Congress, however, has declined to impose any such controls, leaving the distinct impression that firms such as Equifax are not only too big to fail, but too big to bother regulating (see Cynic’s Guide to the Equifax Breach: Nothing Will Change).
“One should not be surprised by this [Equifax] breach scenario,” Murray says. “Few breaches are rooted in a single failure. However, these were all failures of essential practices, ones that would be expected of any business, much less one that deals in purloined data about all citizens.”
Source: https://www.bankinfosecurity.com/blogs/was-equifax-so-stupid-about-passwords-p-2666
pressography-blog1 · 8 years ago
Useful(?) coding tips from the CIA’s school of hacks
New Post has been published on https://pressography.org/beneficial-coding-pointers-from-the-cias-faculty-of-hacks/
Useful(?) coding tips from the CIA’s school of hacks
There are lots of documents in WikiLeaks’ dump of files from the Central Intelligence Agency’s Engineering Development Group (EDG).
Many of the files in the dump are unclassified—manuals provided by Lockheed Martin and other vendors, for instance. Most are classified at the Secret level, including things as innocuous as a guide to getting started with Microsoft Visual Studio, apparently the preferred development tool of the EDG’s Applied Engineering Division (AED). There is also a smattering of meme-construction components and animated GIFs from the anime series Trigun.
But a tiny fraction of the files is highly classified, according to the document markings. This cache sits at the Top Secret level, and it is marked as “Special Intelligence” (SI) and “NOFORN” (no foreign distribution). Out of the first batch of just over 1,000 files, there are two paragraphs marked at that level. Those passages describe minutiae of how the CIA’s Network Operations Division wants the cryptographic features of its tools to work and how the CIA obtains and prepares phones for use in its exploit lab.
So for the most part, the damage done by the files is not what they reveal about the CIA’s hacking and network espionage capabilities. Rather, the problem is the extent to which these leaked documents expose the technical specifications, practices, and other details of the CIA’s internal hacking-tool development teams. Now anyone with access to the documents can see how the EDG used elements taken from malware found in the wild to build its own tools, and what the CIA defines as the “dos and don’ts” for developing attack and espionage tools. In other words, a good deal of the tradecraft of the CIA’s internal hacking teams has been pulled from their collaboration server.
However, a lot of that tradecraft looks like Malware 101 upon inspection. In fact, some of the comments left by CIA developers in 2013 pointed out how dated the practices were. Many of these techniques hardly qualify as secret.
To illustrate this, we’ve annotated a few excerpts from the AED developers’ malware-writing guidance. A good deal of these guidelines would apply to anyone writing a security-focused application. Much of the best practice centers on anti-forensics—making it harder for an adversary’s information security teams to detect and decipher exactly what the malware was doing. Some of the chestnuts on general coding practice include:
I. Do not leave a calling card
AED’s developers were warned against doing things during development that might make it easier for an adversary to figure out where the tool, implant, or malware they developed had come from.
“DO NOT leave dates/times such as compile timestamps, linker timestamps, build times, access times, etc. that correlate to general US core working hours (i.e. 8am-6pm Eastern time).” Such artifacts have often been used by analysts as part of the process of attributing malware to Russian authors, for example.
AED developers were told to use UTC time for all time-dependent operations in code as well. This ensures that tools behave consistently and don’t give away any particular time-zone bias.
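As a trivial illustration of the UTC point (a generic sketch, not anything from the leaked documents), always constructing timezone-aware UTC timestamps keeps times embedded in logs or build artifacts from betraying the developer’s local working hours:

```python
from datetime import datetime, timezone

# A naive local timestamp leaks the builder's time zone; an aware UTC one does not.
local_stamp = datetime.now()                 # e.g. 09:14 in the developer's own zone
utc_stamp = datetime.now(timezone.utc)       # the same instant, expressed without zone bias

print(local_stamp.isoformat())
print(utc_stamp.isoformat())                 # ends in +00:00
```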
“DO strip all debug symbol information, manifests [left by Microsoft Visual C++], build paths, [and] developer usernames from the final build of a binary.” Those sorts of things could be used in attribution as well. For similar reasons, the document exhorts developers not to “leave data in a binary file that demonstrates CIA, USG, or its witting partner companies’ involvement in the creation or use of the binary/tool.”
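A careful developer (or a defender) can check for this class of artifact mechanically. The sketch below is a generic illustration, not an EDG tool; the marker strings are assumptions and would need extending for real use:

```python
import sys

# Strings that commonly betray a build environment (illustrative list only).
MARKERS = [b"C:\\Users\\", b"/home/", b".pdb", b"Visual Studio", b"\\Debug\\"]

def find_build_artifacts(path: str) -> list[bytes]:
    """Return any marker strings found verbatim in the binary."""
    with open(path, "rb") as fh:
        blob = fh.read()
    return [m for m in MARKERS if m in blob]

if __name__ == "__main__":
    hits = find_build_artifacts(sys.argv[1])
    if hits:
        print("Possible attribution artifacts:", hits)
    else:
        print("No obvious build-environment strings found.")
```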
Then there’s the most basic operational security admonition: “DO NOT have data that contains CIA and USG cover terms, compartments, operation code names or other CIA and USG specific terminology in the binary.”
There’s a further caution about another item not to include in tools—bad language. “DO NOT have ‘dirty words’ in the binary. Dirty words, such as hacker terms, may cause unwarranted scrutiny of the binary file in question.”
II. Do not break the target’s computer
AED developers were next warned against rookie mistakes that might make it easier to reverse-engineer their tools. The first rule of Malware Club is not to make the target’s system unusable, thus drawing unwanted attention to the malware’s presence.
“DO NOT perform operations that will cause the target computer to be unresponsive to the user (e.g. CPU spikes, screen flashes, screen ‘freezing’, etc.),” the file warns.
“DO NOT perform Disk I/O operations that will cause the system to become unresponsive to the user or alerting to a System Administrator.” The last thing you want is for someone to look at a system monitor and see something called notepad.exe eating all of a system’s CPU, network, and disk I/O cycles. “DO have a configurable maximum size limit and/or output file count for writing… output files.” This prevents collection jobs run by a tool from filling up the target’s disk storage, for example, an occurrence that would likely prompt a help-desk visit that could reveal the tool’s presence.
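The “configurable maximum size limit” item is straightforward to picture in code. A minimal sketch (generic, not taken from the leaked documents) of an output writer that refuses to exceed a byte budget:

```python
class CappedWriter:
    """Append data to a file but refuse to exceed a configurable byte budget."""

    def __init__(self, path: str, max_bytes: int = 10 * 1024 * 1024):
        self.path = path
        self.max_bytes = max_bytes
        self.written = 0

    def write(self, chunk: bytes) -> bool:
        """Write chunk if it fits in the remaining budget; return False otherwise."""
        if self.written + len(chunk) > self.max_bytes:
            return False  # caller should stop collecting rather than fill the disk
        with open(self.path, "ab") as fh:
            fh.write(chunk)
        self.written += len(chunk)
        return True
```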
In a similar vein, the document instructs, “DO NOT generate crash dump files, core dump files, ‘Blue’ screens, Dr. Watson or other dialog pop-ups and/or other artifacts in the event of a program crash.” Error codes work both ways: they can be useful in forensics as well as in debugging. AED’s developers are directed to force their code to crash during testing to confirm that it won’t give itself up.
III. Use some encryption, dude
Another part of keeping a low profile is encrypting the data used by the tool—in memory, on disk, and over the network. One of the linked files included the following tips:
“DO obfuscate or encrypt all strings and configuration data that directly relate to tool functionality,” because somebody looking at a configuration file or an executable in a text editor or hex editor should not be able to figure out what your tool is doing and where it is sending things just from text embedded in the code. The document calls on AED developers to write code that only decrypts data as it is needed and to immediately remove the unencrypted data from memory as soon as it is no longer needed: “DO NOT RELY ON THE OPERATING SYSTEM TO DO THIS UPON TERMINATION OF EXECUTION.”

“DO NOT write plain-text collection data to disk,” because that could get awkward fast. “DO encrypt all data written to disk,” and “DO utilize a secure erase [overwriting the file with zeros at least once] when removing a file from disk.” That way, there is nothing left in the trash can to retrieve (a minimal sketch of such an erase follows this list).

“DO use end-to-end encryption for all network communications”—because the passive collection of unencrypted data leaving the network could ruin the operation’s day. Use standard Internet protocols for “blending” communications with the rest of the target’s network traffic—not some custom protocol trying to fake its way through as something else. Bad protocols will show up as broken traffic in a network monitor like Wireshark, which could draw attention.

“DO NOT solely rely on SSL/TLS to secure data in transit”—because SSL proxies may be able to man-in-the-middle the network session and decrypt it. That is a lesson that even some secure messaging applications have learned the hard way.

“DO use variable size and timing (aka jitter) of beacons/network communications. DO NOT predictively send packets with a set size and timing. DO proper cleanup of network connections. DO NOT leave around stale network connections.” In short, varying the size and timing of communications back to a command-and-control server will make the packets your tool sends over the network less of an announcement of its presence.
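The secure-erase item above is the simplest of these to sketch: overwrite the file’s contents before unlinking it. This is only an illustration of the one-pass zeroing the document describes; it offers no guarantees on SSDs or journaling filesystems:

```python
import os

def secure_erase(path: str, chunk_size: int = 1 << 20) -> None:
    """Overwrite a file with zeros once, flush it to the device, then delete it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as fh:
        remaining = size
        while remaining > 0:
            step = min(remaining, chunk_size)
            fh.write(b"\x00" * step)
            remaining -= step
        fh.flush()
        os.fsync(fh.fileno())   # make sure the zeroed blocks actually reach the disk
    os.remove(path)
```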
IV. Don’t make the job of the adversary’s forensics team any easier

A lazy espionage-tool coder is an unsuccessful espionage-tool coder. The “DOs and DON’Ts” guide calls out a number of coding-hygiene rules aimed at keeping sloppiness from exposing CIA operations:
“DO strip all debugging output…from the final build of a tool”—because there’s nothing like leaving a little instrumentation behind to help someone else figure out what the tool is for.

“DO NOT explicitly import/call functions that are not consistent with a tool’s overt functionality.” In other words, if you’re disguising an implant as notepad.exe, don’t have it call Windows functions that notepad.exe wouldn’t call—it will raise suspicion and make it easier for someone to figure out what your tool is really doing through static analysis.

“DO NOT export sensitive function names; if having exports are required for the binary, utilize an ordinal or a benign function name,” because having a line of code like “__declspec( DllImport ) void DoVeryBadThings()” might draw the attention of an analyst.
“DO NOT read, write and/or cache data to disk unnecessarily.” Writing too much to disk makes the forensic footprint of a tool more apparent.
Keep it small: “DO make all reasonable efforts to minimize the binary file size for all binaries that will be uploaded to a remote target (without the use of packers or compression). Ideal binary file sizes should be under 150KB for a fully featured tool.”
“DO NOT allow network traffic, such as C2 packets, to be re-playable.” That means communications between the tool and the command-and-control server running it should be time-and-date sensitive, so that an adversary cannot record the traffic and send it back at the tool in an attempt to reverse-engineer what it’s doing.
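One common way to make traffic non-replayable (a generic sketch of the technique, not anything described in the leaked documents; the key and field names are assumptions) is to bind each message to a timestamp and a one-time nonce and authenticate both, so a captured packet is rejected when it is played back later:

```python
import hashlib
import hmac
import json
import os
import time

SHARED_KEY = b"example-key-material"   # illustrative only
MAX_SKEW = 120                          # seconds a message stays valid
_seen_nonces: set[str] = set()

def build_message(payload: dict) -> bytes:
    """Wrap a payload with a timestamp, a random nonce, and an HMAC tag."""
    body = {"ts": int(time.time()), "nonce": os.urandom(12).hex(), "payload": payload}
    raw = json.dumps(body, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, raw, hashlib.sha256).hexdigest()
    return json.dumps({"body": body, "tag": tag}).encode()

def accept_message(wire: bytes) -> dict | None:
    """Return the payload only if the tag verifies and the message is fresh and unseen."""
    msg = json.loads(wire)
    raw = json.dumps(msg["body"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, raw, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, msg["tag"]):
        return None                                  # tampered or wrong key
    if abs(time.time() - msg["body"]["ts"]) > MAX_SKEW:
        return None                                  # stale: replay of an old capture
    if msg["body"]["nonce"] in _seen_nonces:
        return None                                  # exact replay within the window
    _seen_nonces.add(msg["body"]["nonce"])
    return msg["body"]["payload"]
```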