working on like 17 different motif compilation posts rn but half of them tie into each other and i can’t even begin to try analyzing what it all means….. i need to gather and sort them all like skittles first
#always hoping someone else might find something useful in them though#i keep seeing people analyzing all these things i’ve been compiling and feeling extremely validated in my pattern noticing fjshshd#working on something with duplication atm times and copies etc#but i keep getting through the first two seasons and then re-fixating on something else so s3-4 remain unexamined#should i post the first parts of some now/when I finish or wait until I’ve gone through the whole series?#I’ve already posted a couple that I’m not through with but I can never seem to finish anything I say I’ll get back to later anyway fkdj idk#i know they’re not as interesting without analysis but I need to be able to look at the entire scope before I can analyze some of this stuff#and there are others way more competent w analysis so maybe I should just leave it to them and just keep doing my groundwork#i’m just gonna keep making my lists bc that’s what I’m enjoying rn
The Answers in Death
Reflection
Growing up, I always watched, and continue to watch, crime-scene television shows like CSI: Miami and Forensic Files. Even though CSI: Miami is scripted and Forensic Files covers real-life cases, both shows captured my interest from a young age, and I have always wanted to learn more about the work performed in them. The science and technology in both shows grabbed my attention and piqued my curiosity. This is where my love for forensics began, and I knew it was the field I wanted to pursue. For my future career, I would like to be a forensic pathologist: a trained professional who performs autopsies on the bodies of victims to determine the cause of death. If I don’t choose that exact career, then I want to work in a forensics lab, doing something like DNA analysis or toxicology. While determining what I want to do for a career, I answered the questions in Scott Christ’s article, “7 Powerful Questions to Find Out What You Want to Do with Your Life,” to get a better understanding of myself and of how my goals and passions will drive my future career.
My future career has to involve forensics because it is something I am deeply interested in and passionate about, and a lab setting would be ideal. I enjoy, and have a great passion for, performing experiments and using technologically advanced equipment to solve scientific challenges, both for myself and for others. By “for others,” I mean that I would like to work in a lab where my colleagues and I work toward a greater goal — something beyond myself, such as helping victims’ families get answers by providing forensic data about what happened to their loved ones. I value helping others in their desperate times, and even behind-the-scenes work in a lab contributes to that end goal, even if nobody notices the hard work being done.
I feel that a lab setting would suit me because I do not like working in large groups or in a loud atmosphere. I work better in a quiet setting, where I can think peacefully and concentrate, and I work better independently, though I welcome assistance and ideas from other people when things become unclear. A lab would be perfect for these reasons: I would work with only a few people, we would bounce ideas off each other to complete the interweaving tasks at hand, and with so few people there would not be much noise, so we could all concentrate on what we have to do. I take my work very seriously and work very hard to complete my tasks and goals.
I value working hard to achieve my dreams because, in my opinion, if you don’t work hard, you won’t get very far in life. My dad instilled this value in me from a young age; he is a determined and extremely hard-working person. These are values I want to keep working toward and improving on in my life and in a future career. I am highly determined to reach my goals and dreams, and accomplishing them requires hard work. I already work extremely hard in college and will have to keep doing so, because the field I want to enter is relatively small and competitive. To discover the reality of my future career, I went to the Occupational Outlook Handbook to see what it was all about.
The Reality of the Job
According to the Occupational Outlook Handbook entry “Forensic Science Technicians,” the profession offers a wide variety of duties and specializations, requires additional schooling, and is growing much faster than average. Forensic science technicians compile and evaluate evidence from crime scenes to assist in criminal investigations. They work both in labs and at crime scenes. At crime scenes, they determine what evidence needs to be collected, photograph the scene, and record observations, among many other tasks. In the lab, they perform tests on the evidence collected and consult with other specialists. Forensic science technicians may either specialize in a particular area or remain generalists.
Generalist forensic science technicians perform the tasks listed above, but there are also specialty branches within the field, including forensic biologists/chemists and forensic computer examiners. Forensic biologists and chemists use laboratory equipment to analyze collected evidence, while forensic computer examiners gather and interpret data related to computer and internet crimes. All forensic science technicians have to write reports and be able to explain them, which may then be used in court. With so many specialties available, pay and job outlook are somewhat better than average, and with all the different tasks these branches must complete, a forensic science technician’s work week can vary.
Many forensic science technicians work in a variety of weather conditions and may have staggered shifts, often spending long hours in the laboratory. Some work a standard week but remain on call when needed. Supporting such a schedule requires a deep knowledge and understanding of the science, obtained through schooling.
To become a forensic science technician, a Bachelor’s degree in a natural science is typically needed, though a Master’s degree is recommended, and further schooling is needed to specialize in a particular branch of forensics. Technicians may need to pass certain exams or earn accreditation before they are allowed to work independently. They receive on-the-job training both in collecting evidence properly and in their laboratory specialties. Licenses and certifications are not strictly required to enter the occupation, but they help build credentials, and requirements vary by jurisdiction.
Beyond schooling and knowledge, certain personal qualities are needed as well: communication skills, problem-solving skills, and attention to detail. To learn more firsthand from someone currently in the field, I consulted Katherine Brown, whose name has been changed to maintain confidentiality. She works as a forensic toxicologist for the government.
On the Job Interview
Katherine was introduced to the idea of forensics in a middle school science class and has loved it ever since. She was amazed at the different ways that math and science can be used in forensics. “My teacher gave us an assignment of measuring different body parts to be able to estimate the height and weight of a person. I found this idea fascinating,” Katherine exclaimed. From there, she went on to college to pursue forensics, receiving first her Bachelor’s and then her Master’s degree in forensic toxicology. She specified that most laboratories require a Bachelor’s degree in any science, such as biology, chemistry, or, of course, forensic science, and that some agencies prefer a more advanced degree, which is why she obtained her Master’s. That schooling eventually landed her a job in the government.
I was curious what her job as a forensic toxicologist actually entails, because I haven’t done much research in that particular area of the field. “I analyze biological specimens for the presence of alcohol and drugs. I write up reports to my findings and testify when needed,” Katherine stated. She also trains in new test methods, stays up to date on the literature for the methods she currently uses, and reviews other experts’ reports, finding ways to validate and improve them. I found all of this very interesting: her job involves far more tasks than I ever realized, and they all sound like things I would like to do in a future career.
Even with so many tasks to complete, she mentions that she works a regular 40-hour workweek and that every day varies, which really intrigues me because I don’t want to be doing the same thing every day. She also said she likes the variety in her job — she even gets to travel for it, which I didn’t think was possible because I assumed you were just in a lab all day. “I was able to attend the American Academy of Forensic Sciences (AAFS) conference in California,” Katherine recounted of one proud moment. As an expert in the field, Katherine believes there are certain skills a person needs to be successful in it.
Katherine suggests that people need to be “inquisitive, determined, hard-working, and honest.” These qualities echo the ones described in the Occupational Outlook Handbook, and from my own research into the career I agree with them. I believe I possess all of these qualities, which makes me more confident that this is the job for me. Some of Katherine’s words really had an impact on me and reassured me to keep pursuing this dream and to always look for opportunities. “It is an ever growing field and it is what you make it. The possibilities are endless,” Katherine explained.
Even with many possibilities available, each branch of forensics has issues worth considering. Katherine mentioned two major ones in toxicology, and she agrees that it is important to stay on top of current information no matter what field you are in. First, novel psychoactive substances (NPS), or designer drugs, are hard to detect in the body because they are relatively new and the technology has not fully caught up with identifying them clearly. Second, there is growing concern about wrongful convictions based on forensic evidence. My conversation with Katherine made me realize that there are important issues in the field we need to be aware of right now, one of which is wrongful convictions due to overstated evidence. I found an article about exactly this topic in The New York Times.
The Career in the News
In The New York Times article “A Leading Cause for Wrongful Convictions: Experts Overstating Forensic Results,” Heather Murphy discusses why charges against United States citizens are being dismissed: forensic experts overstate scientific results by manipulating certain forensic testing techniques. Two leading causes of these wrongful convictions are official misconduct and misleading forensic evidence. Forensic experts are legally qualified to take the stand in court to present their findings, but Murphy reveals that once they meet the criteria to testify, there are very few restrictions on what they actually say, so the line between truth and exaggeration becomes blurred. Because of those few restrictions, experts can invent odds to sound more believable or convincing, since forensic experts are viewed as credible people. Murphy outlines three forensic analysis techniques that experts can manipulate and provides a case example for each.
The first technique Murphy discusses is microscopic hair comparison. She notes that the F.B.I. has reported that microscopic hair comparison cannot validly establish a match between two hair samples. She provides the case of Glenn Payne, who was arrested and charged with sexual abuse of a child. A lab analyst testified in court that a hair found on the victim matched the abuser’s hair, accompanied by implausible odds that the hair actually belonged to the abuser. Years later, lawyers re-investigated the case and the analyst confessed that his evidence was untrue. This case shows how an expert manipulated the odds of a hair sample analysis when he was himself unsure whose hair it was, which unfortunately led to a wrongful conviction.
The second technique Murphy cites is bite mark matching, which she notes is even harder to compare reliably than hair samples. She provides the case of Steven Chaney, who was arrested and charged with the murder of a couple that sold him drugs. A medical consultant testified in court that a model of the defendant’s bite matched the bite marks on one of the victims’ bodies — again quoting indefensible odds. Murphy explains that experts do this by flatly stating that the defendant is the source when they are really unsure, or by claiming it is virtually impossible that the defendant is not the source. This case shows how an expert who was ultimately unsure of the bite mark evidence manipulated the odds to sound more believable, which again led to a wrongful conviction.
The third technique Murphy discusses is touch DNA amplification. She explains that even though DNA analysis is more respected than the older techniques, an expert presenting it can still manipulate the odds. She provides the case of Mayer Herskovic, who was arrested and charged with assault. The victim’s shoe had been thrown onto a roof by the alleged attacker and was collected for DNA analysis. The genetic sample turned out to be too small to analyze, but the testifying expert claimed he had developed software that could amplify small DNA samples, and he confirmed in court, with unreasonable odds, that the DNA on the victim’s shoe was in fact the attacker’s. This led to a wrongful conviction based on those odds; it was later shown that the expert had greatly exaggerated his new technique, and his office abandoned it after the defendant was exonerated.
These are just a few case examples of techniques whose odds a testifying expert can manipulate into wrongful convictions. Now that these techniques have been shown to be open to manipulation, other cases based solely on such forensic evidence have been reviewed again, and people the courts believed were wrongly convicted have had their convictions overturned and been released from prison. The fact that this has happened so many times, and that the experts practically lied, leaves me curious.
I am highly surprised that this has happened to so many people. I assumed these experts and their facts would be checked by another expert to validate the results before being presented, but that is not the case. The sad truth is that there is barely any limit on what the experts can say, and that frightens me. I am also shocked that forensic experts actually do exaggerate the evidence they present. They are thought to be highly credible because they are among the very few experts allowed to testify in court with their evidence, and most people probably believe them for that reason. It surprises me that they would overstate information — lie, in a sense — when they are believed to be so credible.
I chose this topic because it relates to my potential future career field: it discusses the use of forensic evidence to convict people of crimes, which is something I may do later in life. The topic really interests me because of how many times this has happened and because supposedly credible people would present skewed evidence. The article taught me a lot about what can really happen in the field.
I learned that even though some forensic techniques have been around for a while and are trusted and accepted by the general public, the information that comes from them is not always reliable, and it can be manipulated by experts, which can lead to innocent people being incarcerated. I learned that there are very few limits on what experts can say, which also allows results to be misrepresented. I still need to learn how reliable these techniques are in general for crime scene work and how experts set the criteria for each technique on a case-by-case basis. Ultimately, I would like to learn how experts decide whom to implicate based on the evidence they have, and why they feel the need to lie when their evidence isn’t enough.
I am highly interested in this career field and can’t wait to take more classes on the subject and get more hands-on experience with lab equipment. I’ve learned what it takes to succeed in the field, as well as some of its major issues. I discovered firsthand from an expert that my potential career path is a lot of work, but she reassured me that it is always interesting and worth it. I believe I will continue down this career path, and someday I might even find the answers in death.
By Natasha Cunningham
Porting a 15 year old .NET 1.1 Virtual CPU Tiny Operating System school project to .NET Core 2.0
I've had a number of great guests on the podcast lately. One topic that has come up a number of times is the "toy project." I've usually kept mine private - never putting them on GitHub - somewhat concerned that people would judge me and my code. However, hypocrite that I am (aren't we all?), I have advocated that others put their "Garage Sale Code" online. So here's some crappy code. ;)
The Preamble
While I've been working as an engineer for 25 years this year, I didn't graduate from school with a 4-year degree until 2003 - I just needed to get it done, for myself. I was poking around recently and found my project from OIT's CST352 "Operating Systems" class. One of the projects was to create a "Virtual CPU and OS." This is kind of a thought exercise. It's not really a parser/lexer - although it includes both - and it's not a real OS. But it needs to be able to take in a made-up quasi-assembly-language instruction set and execute it on a virtual CPU while managing virtual memory of arbitrary size. Again, a thought exercise made real to confirm that the student understands the responsibilities of a CPU.
Here's an example "application." Confused yet? Here's the original spec I was given in 2002, including the 36 instructions the "CPU" should understand. It has 10 general-purpose 32-bit registers, addressed as 1 through 10. Register 10 is the stack pointer. There are two one-bit flag registers - a sign flag and a zero flag.
Instructions are "opcode arg1 arg2" with constants prefixed with "$."
11 r8      ;Print r8
6 r1 $10   ;Move 10 into r1
6 r2 $6    ;Move 6 into r2
6 r3 $25   ;Move 25 into r3
23 r1      ;Acquire lock in r1 (currently 10)
11 r3      ;Print r3 (currently 25)
24 r1      ;Release the lock in r1 (currently 10)
25 r3      ;Sleep r3 (currently 25)
11 r3      ;Print r3 (currently 25)
27         ;Exit
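Decoding lines like that is the whole job of the virtual CPU. As a flavor of what execution looks like, here is a minimal fetch-decode-execute sketch of my own (not the project's actual code) covering just opcodes 6, 11, and 27 from the sample above:

// A minimal sketch, not the project's code: fetch-decode-execute for
// opcodes 6 (move constant into register), 11 (print register), 27 (exit).
using System;

static class MiniCpu
{
    public static void Run(string[] lines)
    {
        var registers = new int[11]; // r1..r10; index 0 unused
        bool running = true;
        for (int ip = 0; running && ip < lines.Length; ip++)
        {
            // Strip the ";comment" and split "opcode arg1 arg2"
            string[] parts = lines[ip].Split(';')[0]
                .Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);
            if (parts.Length == 0) continue;
            switch (int.Parse(parts[0]))
            {
                case 6:  // Move $const into rX
                    registers[int.Parse(parts[1].Substring(1))] =
                        int.Parse(parts[2].Substring(1));
                    break;
                case 11: // Print rX
                    Console.WriteLine(registers[int.Parse(parts[1].Substring(1))]);
                    break;
                case 27: // Exit
                    running = false;
                    break;
            }
        }
    }
}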
I wrote my homework assignment in 2002 in the idiomatic C# of the time, on .NET 1.1. That meant no Generics<T> - I had to make my own strongly typed collections (a sketch of that pattern is below). C# has since gained dozens of (if not a hundred) language and syntax improvements. I didn't use a unit testing framework, as TDD was only just emerging around 1999 during the XP (eXtreme Programming) days and NUnit was just getting started. The code also uses "unsafe" to pin down memory in a few places. I'm sure there are WAY WAY WAY better and more sophisticated ways to do all this in the idiomatic C# of 2017. Those are excuses; the real reasons are my own ignorance and ability, combined with some night-school laziness.
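For flavor, here's the kind of boilerplate .NET 1.1 demanded - a sketch with hypothetical names, not my actual classes - next to its one-line 2017 equivalent:

// .NET 1.1 style: a hand-rolled strongly typed collection on CollectionBase.
// (Hypothetical names, for illustration only.)
using System.Collections;

public class Instruction
{
    public byte OpCode;
    public uint Param1;
    public uint Param2;
}

public class InstructionCollection : CollectionBase
{
    public void Add(Instruction item) { List.Add(item); }
    public Instruction this[int index]
    {
        get { return (Instruction)List[index]; }
        set { List[index] = value; }
    }
}

// The 2017 equivalent of all of the above:
// List<Instruction> instructions = new List<Instruction>();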
One of the more fun parts of this exercise was moving from physical memory (a byte array as I recall) to a full-on Memory Manager where each Process thought it could address a whole bunch of Virtual Memory while actual Physical Memory was arbitrarily sized. Then - as a joke - I would swap out memory pages as XML! ;) Yes, to be clear, it was a joke and I still love it.
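The shape of that translation, sketched here from the description above rather than lifted from the actual code (the names are mine), is basically a page table sitting between each process's virtual addresses and one shared physical byte array:

// A sketch of the idea (assumed names, not the project's code): translate a
// process's virtual address through its page table into the shared physical
// byte array, faulting if the page has been swapped out (to XML!).
using System;

public class MemoryManagerSketch
{
    private readonly byte[] physicalMemory;
    private readonly uint pageSize;

    public MemoryManagerSketch(uint physicalBytes, uint pageSize)
    {
        physicalMemory = new byte[physicalBytes];
        this.pageSize = pageSize;
    }

    // pageTable[virtualPage] holds the physical frame number, or -1 if the
    // page is currently swapped out to disk.
    public byte Read(int[] pageTable, uint virtualAddress)
    {
        uint page = virtualAddress / pageSize;
        uint offset = virtualAddress % pageSize;
        int frame = pageTable[page];
        if (frame < 0)
            throw new InvalidOperationException("Page fault - swap the page back in first");
        return physicalMemory[(uint)frame * pageSize + offset];
    }
}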
You can run an "app" by passing in the total physical memory along with the text file containing the program, but you can also run an arbitrary number of programs by passing in an arbitrary number of text files! The "TinyOS" will handle each process thinking it has its own memory, and will time-slice between them.
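For example, something like this (the file names here are made up, following the pattern of the test batch files later in this post) would run three programs at once against 1024 bytes of memory:

dotnet netcoreapp2.0/TinyOSCore.dll 1024 prog1.txt prog2.txt prog3.txt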
If you are more of a visual learner, perhaps you'd prefer this 20-slide PowerPoint on this Tiny CPU that I presented in Malaysia later that year. You dig those early 2000-era slides? I KNOW YOU DO.
Updating a .NET 1.1 app to cross-platform .NET Core 2.0
Step 1 was to download the original code from my own blog. ;) This is also Reason #4134 why you should have a blog.
I decided to use Visual Studio 2017 to upgrade it and, even worse, I decided to use .NET Core 2.0, which is currently in Preview. I wanted to use .NET Core 2.0 not just because it's cross-platform but also because it promises to have a pretty large API surface area, and I want this to "just work." The part about getting my old application running on Linux is going to be awesome, though.
Visual Studio then pops a scary dialog about upgrading files. NOTE that another totally valid way to do this (that I will end up doing later in this blog post) is to just make a new project and move the source files into it. Natch.
Visual Studio says it's targeting .NET 2.0 Full Framework, but I ratchet it up to 4.6 to see what happens. It builds, but with a bunch of warnings about obsolete methods, the most interesting being this one:
Warning CS0618 'ConfigurationSettings.AppSettings' is obsolete: 'This method is obsolete, it has been replaced by System.Configuration!System.Configuration.ConfigurationManager.AppSettings' C:\Users\scott\Downloads\TinyOSOLDOLD\OS Project\CPU.cs 72
That's telling me that my .NET 1/2 API will work but has been replaced in .NET 4.x, but I'm more interested in .NET Core 2.0. I could make my EXE a LIB and target .NET Standard 2.0 or I could make a .NET Core 2.0 app and perhaps get a few more APIs. I didn't do a formal analysis with the .NET Portability Analyzer but I will add that to the list of Things To Do. I may be able to make a library that works on an iPhone - a product that didn't exist when I started this assignment. That would be Just Cool(tm).
I decided to just make a new empty .NET Core 2.0 app and copy the source .cs files into it. A few interesting things came up:
My app also used "unsafe" code (it pins memory down and accesses it directly) - see the sketch after this list.
It has extensive inline documentation in comments that I once fed to NDoc to generate a CHM help file. I'd like that doc to turn into HTML at some point.
It also has an appsettings.json file that needs to get copied to the output folder when it compiles.
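On the first of those points, the "unsafe" pinning pattern looks roughly like this minimal sketch (the pattern, not my actual method) - which is why the csproj below needs AllowUnsafeBlocks:

// A sketch of the pattern (not the project's exact code): pin the managed
// byte array so the GC can't move it, then read 4 bytes through a raw
// pointer as one uint - the kind of trick AllowUnsafeBlocks permits.
static class PhysicalMemorySketch
{
    public static unsafe uint ReadUIntAt(byte[] memory, int address)
    {
        fixed (byte* p = memory)   // pin: the GC won't relocate the array here
        {
            return *(uint*)(p + address);
        }
    }
}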
While I could publish it to a self-contained .NET Core exe, for now I'm running it like this in my test batch files - example:
dotnet netcoreapp2.0/TinyOSCore.dll 512 scott13.txt
Here's the resulting csproj file.
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp2.0</TargetFramework>
    <GenerateDocumentationFile>true</GenerateDocumentationFile>
  </PropertyGroup>

  <PropertyGroup>
    <AllowUnsafeBlocks>true</AllowUnsafeBlocks>
  </PropertyGroup>

  <ItemGroup>
    <None Remove="appsettings.json" />
  </ItemGroup>

  <ItemGroup>
    <Content Include="appsettings.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </Content>
  </ItemGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.Configuration" Version="2.0.0-preview2-final" />
    <PackageReference Include="Microsoft.Extensions.Configuration.Json" Version="2.0.0-preview2-final" />
    <PackageReference Include="Microsoft.Extensions.DependencyInjection" Version="2.0.0-preview2-final" />
    <PackageReference Include="Microsoft.Extensions.Options.ConfigurationExtensions" Version="2.0.0-preview2-final" />
  </ItemGroup>

</Project>
Configuration is even more different on .NET Core 2.0. This little TinyOS has a bunch of config options that come in from a .exe.config file in XML like this (truncated):
<configuration>
  <appSettings>
    <!-- Must be a factor of 4.
         This is the total Physical Memory in bytes that the CPU can address.
         This should not be confused with the amount of total or addressable
         memory that is passed in on the command line. -->
    <add key="PhysicalMemory" value="128" />
    <!-- Must be a factor of 4.
         This is the amount of memory in bytes each process is allocated.
         Therefore, if this is 256 and you want to load 4 processes into the
         OS, you'll need to pass a number > 1024 as the total amount of
         addressable memory on the command line. -->
    <add key="ProcessMemory" value="384" />
    <add key="DumpPhysicalMemory" value="true" />
    <add key="DumpInstruction" value="true" />
    <add key="DumpRegisters" value="true" />
    <add key="DumpProgram" value="true" />
    <add key="DumpContextSwitch" value="true" />
    <add key="PauseOnExit" value="false" />
I have a few choices. I could make a Configuration Provider and teach .NET Core to read this format (there's an XML adapter, in fact), or I could make the code porting easier by moving these name/value pairs to a JSON file like this:
{ "PhysicalMemory": "128", "ProcessMemory": "384", "DumpPhysicalMemory": "true", "DumpInstruction": "true", "DumpRegisters": "true", "DumpProgram": "true", "DumpContextSwitch": "true", "PauseOnExit": "false", "SharedMemoryRegionSize": "16", "NumOfSharedMemoryRegions": "4", "MemoryPageSize": "16", "StackSize": "16", "DataSize": "16" }
This was just a few minutes of search and replace to change the XML to JSON. I could also have written a little app or shell script (a sketch of one follows below). By changing the config rather than writing an adapter, I could keep the code 99% the same.
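Had I scripted it, a tiny throwaway like this would have done the trick (a sketch that assumes the flat key/value shape above; the file names are placeholders):

// A throwaway converter sketch: read the <add key="..." value="..."/> pairs
// from the old .exe.config and write them out as flat JSON strings.
using System.IO;
using System.Linq;
using System.Xml.Linq;

class ConfigToJson
{
    static void Main()
    {
        var doc = XDocument.Load("old.exe.config"); // placeholder file name
        var pairs = doc.Descendants("add")
            .Select(e => $"  \"{e.Attribute("key").Value}\": \"{e.Attribute("value").Value}\"");
        File.WriteAllText("appsettings.json",
            "{\n" + string.Join(",\n", pairs) + "\n}");
    }
}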
My code was doing things like this (all over...there was no DI container yet):
bytesOfPhysicalMemory = uint.Parse(ConfigurationSettings.AppSettings["PhysicalMemory"]);
And I'd like to avoid major refactoring - yet. I added this bit of .NET Core configuration at the top of the EntryPoint and saved away an IConfigurationRoot:
var builder = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json");
Configuration = builder.Build();
Now I've got an IConfiguration called "Configuration" that behaves like a dictionary of name/value pairs. So I just do this in a dozen places and the app compiles again:
bytesOfPhysicalMemory = uint.Parse(Configuration["PhysicalMemory"]);
This brings up that feeling we all have when we look at old code - especially our own old code. I should have abstracted that away! Why didn't I use an interface? Why so many statics? What was I thinking?
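For what it's worth, the packages already referenced in the csproj would make the "right" version pretty small. Here's a sketch (the class and names are mine, not anything in the project) of binding the flat config into a typed options class instead of uint.Parse-ing strings everywhere:

// A sketch of where this could go (hypothetical class, not in the project):
// bind the flat string config into one typed object via the
// Microsoft.Extensions.Configuration binder.
public class TinyOsOptions
{
    public uint PhysicalMemory { get; set; }
    public uint ProcessMemory { get; set; }
    public bool DumpRegisters { get; set; }
    // ...and so on for the other keys
}

// At the top of the EntryPoint:
// var options = new TinyOsOptions();
// Configuration.Bind(options);  // converts "128" -> 128u, "true" -> true
// bytesOfPhysicalMemory = options.PhysicalMemory;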
We can beat ourselves up, or we can feel good about ourselves and remember this: the app worked. It still works. There is value in it. I learned a lot. I'm a better programmer now. I don't know how far I'll take this old code, but I had a lovely afternoon porting it to .NET Core 2.0, and I may refactor the heck out of it or I may not.
For now I did update the smoke tests to run on both Windows and Linux and I'm happy with the experiment.
Related Links
Download PPT Slides on the Tiny OS presented at TechEd Malaysia 2002
Andy Clarke from New Zealand took the original spec and did the homework assignment in 2012! His project - 5 years ago, and 10 years after mine - includes some interesting changes. Rather than an EXE that takes in the programs from the command line, he wrote 221 NUnit 2 tests that check each individual component, as well as more comprehensive integration tests (as unit tests) for the programs. The "assembly" language has been changed from opcodes to more human-readable commands like "move" and "add." I think Andy's solution is much nicer than mine, but he wouldn't pass the class because the spec was pretty clear and my teacher was a stickler. ;) I LOVE that someone else did this on their own!
PODCAST: YOU should write an interpreter with Thorsten Ball
Have YOU done a project like this, either in school or on your own?
Sponsor: Check out JetBrains Rider: a new cross-platform .NET IDE. Edit, refactor, test, build and debug ASP.NET, .NET Framework, .NET Core, or Unity applications. Learn more and get access to early builds!
© 2017 Scott Hanselman. All rights reserved.