#also i started chipping away at my self worth issues thanks to about 2 dozen ot3 fics so shout-out for that too
riverofrainbows · 1 month
Text
I worked my way through the first few pages of the eliot spencer tag in the leverage fandom on ao3, and left a whole bunch of comments, and now i get to see all the replies coming in!!!! This is so great!!
Of course not everyone replies, and i don't expect them to even though i hope for it, and a lot of these fics are about a decade old so maybe the authors aren't even in the fandom anymore.
But it's so lovely.
I didn't manage to leave a comment on every one because i was reading those fics like a drowning man trying to breathe, both when i was feeling well and when i wasn't, but i tried, and when i managed to i left long ones because that's what i enjoy doing most, and i always left kudos which i hope conveys love for the story as well.
I appreciate so much how there are so many stories i get to read, i still have a lot of pages left too.
This is so wonderful.
26 notes
gta-5-cheats · 6 years
Text
Inside Atari’s rise and fall
By Jamie Lendino, Contributor
By the first few months of 1982, it had become more common to see electronics stores, toy stores, and discount variety stores selling 2600 games. This was before Electronics Boutique, Software Etc., and later, GameStop. Mostly you bought games at stores that sold other electronic products, like Sears or Consumer Distributors. Toys ’R’ Us was a big seller of 2600 games. To buy one, you had to get a piece of paper from the Atari aisle, bring it to the cashier, pay for it, and then wait at a pickup window behind the cash register lanes.
Everyone had a favorite store in their childhood; here’s a story about one of mine. A popular “destination” in south Brooklyn is Kings Plaza, a giant (for Brooklyn) two-story indoor mall with about 100 stores. My mother and grandmother were avid shoppers there. To get to the mall from our house, it was about a 10-minute car service ride. So once a week or thereabouts, we’d all go. The best part for me was when we went inside via its Avenue U entrance instead of on the Flatbush Avenue side. Don’t ask me what went into this decision each time; I assume it depended on the stores my mother wanted to go to. All I knew was the Avenue U side had this circular kiosk maybe 50 feet from the entrance. The name has faded from memory. I remember it was a kind of catch-all for things like magazines, camera film, and other random stuff.
But the most important things were the Atari cartridges. There used to be dozens of colorful Atari game boxes across the wall behind the counter. When we walked up to the cashier’s window, there was often a row of new Atari games across the top as well. Sometimes we left without a new cartridge, and sometimes I received one. But we always stopped and looked, and it was the highlight of my trip to the mall each time.
For whatever reason, I remember the guy behind the counter gave me a hard time one day. I bought one of Atari’s own cartridges—I no longer remember which, but I’m almost sure it was either Defender or Berzerk—that came with an issue of Atari Force, the DC comic book. I said I was excited to get it. The guy shot me a dirty look and said, “You’re buying a new Atari cartridge just for a comic book?” I was too shy to argue with him, even though he was wrong and I wanted the cartridge. I don’t remember what my mother said, or if she even heard him. I sheepishly took my game and we walked away.
Mattel began to run into trouble with its Intellivision once the company tried to branch out from sports games. Because Mattel couldn’t license properties from Atari, Nintendo, or Sega, it instead made its own translations of popular arcade games. Many looked better than what you’d find on the 2600, but ultimately played more slowly thanks to the Intellivision’s sluggish CPU. Perhaps the most successful was Astrosmash, a kind of hybrid of Asteroids and Space Invaders, in which asteroids, spaceships, and other objects fell from the sky and became progressively more difficult to avoid. Somewhat less successful were games like Space Armada (a Space Invaders knock-off).
Mattel also added voice synthesis—something that was all the rage at the time—to the Intellivision courtesy of an add-on expansion module called Intellivoice. But only a few key games delivered voice capability: Space Spartans, Bomb Squad, B-17 Bomber (all three were launch titles), and later, Tron: Solar Sailer. The Intellivoice’s high cost, lack of a truly irresistible game, and overall poor sound quality meant this was one thing Atari didn’t have to find a way to answer with the 2600.
These events made it easier for Atari to further pull away from Mattel in the marketplace, and it did so—but not without a tremendous self-inflicted wound. A slew of new 2600 games arrived in the first part of 1982. Many important releases came in this period and those that followed, and we’ll get to those shortly. But there was one in particular that the entire story arc of the platform balanced on, and then fractured. It was more than a turning point; its repercussions reverberated throughout the then-new game industry, and to this day it sticks out as one of the key events that ultimately did in Atari.
The single biggest image-shattering event for the 2600—and Atari itself—was the home release of its Pac-Man cartridge. I can still feel the crushing disappointment even now. So many of my friends and I looked forward to this release. We had talked about it all the time in elementary school. Pac-Man was simply the hottest thing around in the arcades, and we dreamed of playing it at home as much as we wanted. The two-year wait for Atari to release the 2600 cartridge seemed like forever. Retailers bought into the hype as well. Toy stores battled for inventory, JC Penney and Kmart bought in big along with Sears and advertised on TV, and even local drug stores started stocking the game. And yet, what we got…wasn’t right.
Just about everyone knows how Pac-Man is supposed to work, but just in case: You gobble up dots to gain points while avoiding four ghosts. Eat a power pellet, and you can turn the tables on the ghosts, chase them down, and eat them. Each time you do so, the “eyes” of the ghost fly back to the center of the screen and the ghost regenerates. Eat all the dots and power pellets on the screen, and you progress to the next one, which gets harder. Periodically, a piece of fruit appears at the center of the screen. You can eat it for bonus points, and the kind of fruit denotes the level you are on (cherry, strawberry, orange, and so on).
But that’s not the game Atari 2600 owners saw. After securing the rights to the game from Namco, Atari gave programmer Tod Frye just five weeks to complete the conversion. The company had learned from its earlier mistakes and promised Frye a royalty on every cartridge manufactured (not sold), which was an improvement. But this was another mistake. The royalty plus the rushed schedule meant Frye would make money even if the game wasn’t up to snuff, so he had every incentive to complete it regardless of quality. Atari also required the game to fit into just 4KB like older 2600 cartridges, rather than the newer 8KB size that was becoming much more common by this point. That profit-driven limitation heavily influenced the way Frye approached the design of the game. To top it all off, Atari set itself up for a colossal failure by producing some 12 million cartridges, even though there were only 10 million 2600 consoles in circulation at the time. The company was confident that not only would every single existing 2600 owner buy the game, but that 2 million new customers would buy the console itself just for this cartridge.
We all know how it turned out. The instruction manual sets the tone for the differences from the arcade early on. The game is now set in “Mazeland.” You eat video wafers instead of dots. Every time you complete a board, you get an extra life. The manual says you also earn points from eating power pills, ghosts, and “vitamins.” Something is definitely amiss.
Pac-Man himself always looks to the right or left, even if he is going up or down. The video wafers are long and rectangular instead of small, square dots. Fruits don’t appear periodically at the center of the screen. Instead, you get the aforementioned vitamin, a clear placeholder for what would have been actual fruit had there been more time to get it right. The vitamin always looks the same and is always worth 100 points, instead of increasing as you clear levels. The rest of the scoring is much lower than it is in the arcade. Gobbling up all four ghosts totals just 300 points, and each video wafer is worth just 1 point.
The ghosts have tremendous amounts of flicker, and they all look and behave identically, instead of having different colors, distinct personalities, and eyes that pointed in the right direction. The flicker was there for a reason. Frye used it to draw the four ghosts in successive frames with a single sprite graphic register, and drew Pac-Man every frame using the other sprite graphic register. The 2600’s TIA chip synchronizes with an NTSC television picture 60 times per second, so you end up seeing a solid Pac-Man, maze, and video wafers (I can still barely type “video wafers” with a straight face), but the ghosts are each lit only one quarter of the time. A picture tube’s phosphorescent glow takes a little bit to fade, and your eye takes a little while to let go of a retained image as well, but the net result is that the flicker is still quite visible.
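To make that time-multiplexing trick concrete, here is a minimal sketch of the frame loop it implies, written in Python purely for illustration — the real game is hand-written 6502 assembly driving the TIA directly, and every name below is invented for the example:

```python
# Sketch of the 2600 Pac-Man flicker technique: the TIA offers two sprite
# registers, so Pac-Man gets one every frame while the four ghosts share
# the other in round-robin fashion, one ghost per 60 Hz frame.
# All names are hypothetical; the real code is 6502 assembly.

GHOSTS = ["ghost_red", "ghost_pink", "ghost_cyan", "ghost_orange"]

def render_frame(frame_number, draw_sprite):
    # Pac-Man is drawn every frame, so he appears solid on screen.
    draw_sprite("pacman")

    # Only one ghost fits in the remaining sprite register, so cycle
    # through them: each ghost is lit one frame in four (15 Hz), which
    # the eye perceives as heavy flicker even with phosphor persistence.
    draw_sprite(GHOSTS[frame_number % len(GHOSTS)])

# Over any four consecutive frames, every ghost gets drawn exactly once.
for frame in range(4):
    render_frame(frame, lambda name: print(f"frame {frame}: {name}"))
```

Over any four consecutive frames each ghost is drawn exactly once, which is why a camera pointed at the screen catches frames with ghosts missing even though your eye blends them into flickering, semi-transparent shapes.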
It gets worse. The janky, gritty sound effects are bizarre, and the theme song is reduced to four dissonant chords. (Oddly, these sounds resurfaced in some movies over the next 20 years and were a default “go-to” for sound designers working in post-production.) The horizontally stretched maze is nothing like the arcade, either, and the escape routes are at the top and bottom instead of the sides. The maze walls aren’t even blue; they’re orange, with a blue background, because it’s been reported Atari had a policy that only space games could have black backgrounds (!). At this point, don’t even ask about the lack of intermissions.
One of Frye’s own mistakes is that he made Pac-Man a two-player game. “Tod used a great deal of memory just tracking where each player had left off with eaten dots, power pellets, and score,” wrote Goldberg and Vendel in Atari Inc.: Business is Fun. Years later, when Frye looked at the code for the much more arcade-faithful 2600 Ms. Pac-Man, he saw the programmers were “able to use much more memory for graphics because it’s only a one player game.”
Interestingly, the game itself is still playable. Once you get past the initial huge letdown and just play it on its own merits, Pac-Man puts up a decent showing. It’s still “Pac-Man,” sort of, even if it delivers a rough approximation of the real thing, as if it were seen and played through a straw. It’s worth playing today for nostalgia—after all, many of us played this cartridge to death anyway, because it was the one we had—and certainly as a historical curiosity for those who weren’t around for the golden age of arcades.
Many an Atari 2600 fan turned on the platform—and Atari in general—after the release of Pac-Man. Although the company still had plenty of excellent games and some of the best were yet to come, the betrayal was immediate and real and forever colored what much of the gaming public thought of Atari. The release of the Pac-Man cartridge didn’t curtail the 2600’s influence on the game industry by any means; we’ll visit many more innovations and developments as we go from here on out. But the 2600 conversion of Pac-Man gave the fledgling game industry its first template for how to botch a major title. It was the biggest release the Atari 2600 had and would ever see, and the company flubbed it about as hard as it could. It was New Coke before there was New Coke.
The next few games we’ll discuss further illustrate the quality improvements upstart third-party developers delivered, in comparison with Atari, which had clearly become too comfortable in its lead position. First up is Activision’s Grand Prix, which in hindsight was a bit of an odd way to design a racer. It’s a side-scroller on rails that runs from left to right, and is what racing enthusiasts call a time trial. Although other computer-controlled cars are on the track, you’re racing against the clock, not them, and you don’t earn any points or increase your position on track for passing them.
Gameplay oddities aside, the oversized Formula One cars are wonderfully detailed, with brilliant use of color and animated spinning tires. The shaded color objects were the centerpiece of the design, as programmer David Crane said in a 1984 interview. “When I developed the capability for doing a large multicolored object on the [2600’s] screen, the capability fitted the pattern of the top view of a Grand Prix race car, so I made a racing game out of it.” Getting the opposing cars to appear and disappear properly as they entered and exited the screen also presented a problem, as the 2600’s lack of a frame buffer came into play again. The way TIA works, the 2600 would normally just make the car sprite begin to reappear on the opposite side of the screen as it disappeared from one side. To solve this issue, Crane ended up storing small “slices” of the car in ROM, and in real time the game drew whatever portions of the car were required to reach the edge of the screen. The effect is smooth and impossible to detect while playing.
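Here is a rough sketch of that slice technique, under stated assumptions: the car artwork is represented as rows of characters, the “ROM tables” are dictionaries of pre-cropped views, and all names and sizes are hypothetical — the real game selects slice data in 6502 assembly, not Python:

```python
# Sketch of Grand Prix's "slices in ROM" trick: precompute partial views
# of the car so a car at the screen edge is clipped cleanly instead of
# wrapping around to the opposite side, which is what the TIA's sprite
# positioning would otherwise produce. Names and sizes are illustrative.

CAR = [
    "..XXXX..",
    "XXXXXXXX",
    "XXXXXXXX",
    "..XXXX..",
]
CAR_WIDTH = len(CAR[0])
SCREEN_WIDTH = 40

# The "ROM tables": the car pre-cropped to every possible visible width.
LEFT_EDGE = {w: [row[CAR_WIDTH - w:] for row in CAR] for w in range(CAR_WIDTH + 1)}
RIGHT_EDGE = {w: [row[:w] for row in CAR] for w in range(CAR_WIDTH + 1)}

def visible_car(x):
    """Return the car rows visible when the car's left edge is at column x."""
    if x < 0:                            # partially off the left edge
        return LEFT_EDGE[max(0, CAR_WIDTH + x)]
    if x > SCREEN_WIDTH - CAR_WIDTH:     # partially off the right edge
        return RIGHT_EDGE[max(0, SCREEN_WIDTH - x)]
    return CAR                           # fully on screen

for x in (-5, 10, 37):                   # off left, fully visible, off right
    print(x, visible_car(x))
```

The appeal of the trade-off is classic space-for-time: cropping at runtime would cost cycles the 2600 can't spare mid-scanline, so the cropped variants sit in ROM and the draw routine simply picks one.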
The car accelerates over a fairly long period of time, and steps through simulated gears. Eventually it reaches a maximum speed and engine note, and you just travel along at that until you brake, crash into another car, or reach the finish line. As the manual points out, you don’t have to worry about cars coming back and passing you again, even if you crash. Once you pass them, they’re gone from the race.
The four game variations in Grand Prix are named after famous courses that resonate with racing fans (Watkins Glen, Brands Hatch, Le Mans, and Monaco). The courses bear no resemblance to the real ones; each game variation is simply longer and harder than the last. The tree-lined courses are just patterns of vehicles that appear on screen. Whenever you play a particular game variation, you see the same cars at the same times (unless you crash, which disrupts the pattern momentarily). The higher three variations include bridges, which you have to quickly steer onto or risk crashing. During gameplay, a series of oil slicks serves as a warning that a bridge is coming up soon.
Although Atari’s Indy 500 set the bar early for home racing games on the 2600, Grand Prix demonstrated you could do one with a scrolling course and much better graphics. This game set the stage for more ambitious offerings the following year. And several decades later, people play games like this on their phones. We just call titles like Super Mario Run (a side-scroller) and Temple Run (3D-perspective) “endless runners,” as they have running characters instead of cars.
Activision soon became the template for other competing third-party 2600 developers. In 1981, Atari’s marketing vice president and a group of developers, including the programmers for Asteroids and Space Invaders on the console, started a company called Imagic. The company had a total of nine employees at the outset. Its name was derived from the words “imagination” and “magic”—two key components of every cartridge the company planned to release. Imagic games were known for their high quality, distinctive chrome boxes and labels, and trapezoidal cartridge edges. As with Activision, most Imagic games were solid efforts with an incredible amount of polish and were well worth purchasing.
Although Imagic technically became the second third-party developer for the 2600, the company’s first game didn’t arrive until March 1982. Another company, Games by Apollo, beat it to the punch by starting up in October 1981 and delivering its first (mediocre) game, Skeet Shoot, before the end of the year.
But when that first Imagic game did arrive, everyone noticed.
At first glance, the visually striking Demon Attack looks kind of like a copy of the arcade game Phoenix, at least without the mothership screen (something it does gain in the Intellivision port). But the game comes into its own the more you play it. You’re stuck on the planet Krybor. Birdlike demons dart around and shoot clusters of lasers down toward you at the bottom of the screen. Your goal is to shoot the demons all out of the sky, wave after wave.
The playfield is mostly black, with a graded blue surface of the planet along the bottom of the screen. A pulsing, beating sound plays in the background. It increases in pitch the further you get into each level, only to pause and then start over with the next wave. The demons themselves are drawn beautifully, with finely detailed, colorful designs that are well animated and change from wave to wave. Every time you complete a wave, you get an extra life, to a maximum of six.
On later waves, the demons divide in two when shot, and are worth double the points. You can shoot the smaller demons, or just wait—eventually each one swoops down toward your laser cannon, back and forth until it reaches the bottom of the screen, at which point it disappears from the playfield. Shoot it while it’s diving and you get quadruple points. In the later stages, demons also shoot longer, faster clusters of lasers at your cannon.
The game is for one or two players, though there’s a cooperative mode that lets you take turns against the same waves of demons. There are also variations of the game that let you shoot faster lasers, as well as tracer shots that you can steer into the demons. After 84 waves, the game ends with a blank screen, though reportedly a later run of this cartridge eliminates that and lets you play indefinitely. If I were still nine years old, I could probably take a couple of days out of summer and see if this is true. I am no longer nine years old.
Demon Attack was one of Imagic’s first three games, along with Trick Shot and Star Voyager. Rob Fulop, originally of Atari fame and one of Imagic’s four founders, programmed Demon Attack. In November 1982, Atari sued Imagic because of Demon Attack’s similarity to Phoenix, the home rights of which Atari had purchased from Centuri. The case was eventually settled. Billboard magazine listed Demon Attack as one of the 10 best-selling games of 1982. It was also Imagic’s best-selling title, and Electronic Games magazine awarded it Game of the Year.
“The trick to the Demon Attack graphics was it was the first game to use my Scotch-taped/rubber-banded dedicated 2600 sprite animation authoring tool that ran on the Atari 800,” Fulop said in 1993. “The first time Michael Becker made a little test animation and we ran Bob Smith’s utility that successfully squirted his saved sprite data straight into the Demon Attack assembly code and it looked the same on the [2600] as it did on the 800 was HUGE! Before that day, all 2600 graphics ever seen were made using a #2 pencil, a sheet of graph paper, a lot of erasing, and a list of hex codes that were then retyped into the source assembly code, typically introducing a minimum of two pixel errors per eight-by-eight graphic stamp.”
Although you can draw a line from Space Invaders to just about any game like this, Demon Attack combines that with elements of Galaga and Phoenix, with a beautiful look and superb gameplay all its own.
A watershed moment in video game history, David Crane’s Pitfall! was one of the best games released for the 2600. As Pitfall Harry, your goal is to race through the jungle and collect 32 treasures—money bags, silver bars, gold bars, and diamond rings, worth from 2,000 to 5,000 points each. Jump and grab vines, and you soar over lakes, quicksand, and alligators, complete with a Tarzan-style “yell.” You can stumble on a rolling log or fall into a hole, both of which just dock you some points. Each time you fall into quicksand or a tar pit, drown in a lake, burn in a fire, or get eaten by an alligator or scorpion, you lose a life. When that happens, you start the next one by dropping from the trees on the left side of the screen to keep playing.
Pushing the joystick left or right makes Pitfall Harry run. He picks up treasure automatically. Holding the stick in either direction while pressing the button makes him jump, either over an obstacle or onto a swinging vine (running into the vine without jumping also works). Push down while swinging to let go of the vine. You also can push up or down to climb ladders.
In an incredible feat of programming, the game contains 255 screens, with the 32 treasures scattered throughout them. The world loops around once you reach the last screen. Although Adventure pioneered the multiroom map on the 2600, Pitfall! was a considerably larger design. Crane fit the game into the same 4KB ROM as Adventure. But rather than storing all 255 screens as part of the ROM—which wouldn’t have fit—Crane’s solution was not to store the world in ROM at all. Instead, the world is generated by code, the same way each time. This is similar to games like Rogue, but even in that case, the game generates the world and then stores it during play. Pitfall! generates each screen via an algorithm, using a counter that increments in a pseudorandom sequence that is nonetheless consistent and can be run forwards or backwards. The 8 bits of each number in the counter sequence define the way the board looks. Bits 0 through 2 are object patterns, bits 3 through 5 are ground patterns, bits 6 and 7 cover the trees, and bit 7 also affects the underground pattern. Because the sequence is deterministic, when you leave one screen you always end up on the same next screen.
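A minimal sketch of that generator in Python: disassemblies of Pitfall! commonly describe the counter as an 8-bit linear-feedback shift register whose new low bit is the XOR of bits 3, 4, 5, and 7, and because each step is invertible, walking left simply runs the register backwards. Treat the tap positions, the field decoding, and the seed below as illustrative rather than a byte-exact port:

```python
# Sketch of a Pitfall!-style screen generator: an 8-bit LFSR that can be
# stepped in either direction. Tap bits 3, 4, 5, and 7 are the ones
# commonly cited in disassemblies; treat them as illustrative here.

def next_screen(r: int) -> int:
    """Advance the counter one screen to the right."""
    bit = ((r >> 3) ^ (r >> 4) ^ (r >> 5) ^ (r >> 7)) & 1
    return ((r << 1) | bit) & 0xFF

def prev_screen(r: int) -> int:
    """Run the counter backwards: one screen to the left."""
    bit7 = (r ^ (r >> 4) ^ (r >> 5) ^ (r >> 6)) & 1  # recover the shifted-out bit
    return (r >> 1) | (bit7 << 7)

def decode(r: int) -> dict:
    """Split the counter into the bit fields the text describes."""
    return {
        "objects": r & 0b111,         # bits 0-2: object pattern
        "ground": (r >> 3) & 0b111,   # bits 3-5: ground pattern
        "trees": (r >> 6) & 0b11,     # bits 6-7: tree pattern
        "underground": (r >> 7) & 1,  # bit 7 also affects the tunnels
    }

r = 0xC4                                  # arbitrary seed for the example
assert prev_screen(next_screen(r)) == r   # stepping right then left returns home
print(decode(r))
```

Nothing about the world is stored; the 255-screen jungle exists only as this one byte plus the rules for interpreting it, which is how the whole map fits in a 4KB cartridge alongside the rest of the game.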
“The game was a jewel, a perfect world incised in a mere [4KB] of code,” Nick Montfort wrote in 2001 in Supercade: A Visual History of the Videogame Age, 1971-1984.
You get a total of three lives, and Crane points out in the manual that you need to use some of the underground passages (which skip three screens ahead instead of one) to complete the game on time. The inclusion of two on-screen levels—above ground and below ground, with ladders connecting them—makes the game an official platformer. And the game even gives you some say in where to go and what path you take to get there. Pitfall Harry is smoothly animated, and the vines deliver a genuine sensation of swinging even though the game is in 2D.
The game’s 20-minute timer, which approximates the 22-minute length of a standard half-hour television show, marked a milestone for console play. It was much longer than most arcade games and even cartridges like Adventure, which you could complete in a few minutes. The extra length allows for more in-depth play.
“Games in the early ’80s primarily used inanimate objects as main characters,” Crane said in a 2011 interview. “Rarely there would be a person, but even those weren’t fully articulated. I wanted to make a game character that could run, jump, climb, and otherwise interact with an on-screen world.” Crane spent the next couple of years tinkering with the idea before finally coming up with Pitfall!. “[After] only about 10 minutes I had a sketch of a man running on a path through the jungle collecting treasures. Then, after ‘only’ 1,000 hours of pixel drawing and programming, Pitfall Harry came to life.”
Crane said he had already gone beyond that 4KB ROM limit and back within it many times over hundreds of hours. Right before release, he was asked to add additional lives. “Now I had to add a display to show your number of lives remaining, and I had to bring in a new character when a new life was used.” The latter was easy, Crane said, because Pitfall Harry already knew how to fall and stop when he hit the ground. Crane just dropped him from behind the tree cover. “For the ‘Lives’ indicator I added vertical tally marks to the timer display. That probably only cost 24 bytes, and with another 20 hours of ‘scrunching’ the code I could fit that in.”
Pitfall! couldn’t have been timed more perfectly, as Raiders of the Lost Ark was the prior year’s biggest movie. The cartridge delivered the goods; it became the best-selling home video game of 1982, and it’s often credited as the game that kickstarted the platformer genre. Pitfall! held the top spot on Billboard’s chart for 64 consecutive weeks. “The fine graphic sense of the Activision design team greatly enriches the Pitfall! experience,” Electronic Games magazine wrote in January 1983, naming the cartridge Best Adventure Videogame. “This is as richly complex a video game as you’ll find anywhere…Watching Harry swing across a quicksand pit on a slender vine while crocodiles snap their jaws frantically in a futile effort to tear off a little leg-of-hero snack is what video game adventures are all about.” Pitfall!’s influence is impossible to overstate. From Super Mario Bros. to Prince of Persia to Tomb Raider, it was the start of something huge.
0 notes
kidsviral-blog · 6 years
Text
I'm Mending My Broken Relationship With Food
After a lifetime struggling with disordered eating, I’m still figuring out how to have a healthy relationship with my body and what I feed it.
It’s a late night in winter, and I am standing over my gas stove heating a metal spoon. I hold the handle gently in my fingers, carefully rotating the bowl over the tips of the indigo flames as the pale yellow pat of Smart Balance butter inside begins to liquefy. The sleeves of my oversized sweatshirt graze the middle of my palms and I step on the hem of my baggy sweatpants as, slowly, I pull the spoon away. A tiny drop of hot liquid falls on my toes as I tip its contents over the edge of a plain white bowl filled with sugar. I add flour, some milk, a few drops of vanilla, and a handful of chocolate chips. I stir. I taste.
I take the bowl to the couch, balance it precariously on the edge, and lie down on my side, my fingers the only utensil, pinching stray sugary flecks off the velvet dark gray fabric as The Real Housewives of New Jersey blares on the TV. It’s been nearly three years since a therapist told me I’m a disordered eater. Yet, after one personal trainer, over two years of therapy, three juice cleanses, four gym memberships, 20 pounds lost, 30 pounds gained back, and thousands of dollars spent on healthy groceries and high-end cookware, I am 24 years old and spending another night, like so many nights before, eating a bowl of last-minute, mediocre cookie dough alone in my apartment at 11 p.m. And I hate myself for it.
I’ve been overweight — or bordering on it — nearly my entire life, at least since my family moved to the U.S. when I was 4. When I was a child, a routine fight between my Hungarian mother and me was over how much I ate for dinner. Propping my elbows on our scratched dining table, I’d watch her petite, pale hands hovering above me, ladling spoon upon spoon of rice on my father’s plate. “NO FAIR, DAD GOT THE BIGGER ONE,” I’d cry out when my own would finally land, unable to grasp why a 5-foot-10-inch, 200-plus-pound Nigerian man would need to eat more than I did. Seconds, for me, were a must. Thirds weren’t unusual.
Growing up in a white, affluent neighborhood in Lubbock, Texas, I was the only Anita in a sea of Amandas, Brittanys, and Tiffanys. I was biracial, brown and round, with a puffy ball of hair that sat squarely banded in the middle of my head. The boys called it a “burnt marshmallow” and “tumor.” Isolated and othered, I began using food as a coping mechanism around middle school, when my parents began letting me walk home (across the street) alone. I’d spend the two hours until my mom got off work by myself. My best friends had “boyfriends” in the way suburban preteens can — notes, stuffed animals, dates at the roller rink on school skate night. I had a gallon of Edy’s chocolate chip waiting in the freezer for me each day.
Eventually, my mom realized I was sneaking food and she started hiding sweets in the kitchen in hopes of curbing my steady weight gain. Instead, I became an expert at climbing on countertops, calculating how much I could eat of something before she would notice, and burying wrappers in the trash. Often, I’d throw away the balanced, nutritious lunches she’d pack me — whole wheat wraps and sandwiches, fruits, veggies, hard-boiled eggs — in favor of pizza and curly fries. “You ate your lunch today, right?” she’d ask cautiously, waiting for the “yes” we both knew was a lie. She was careful not to tie my weight to my worth, but rather reminded me constantly that what I was doing wasn’t healthy. Looking back, I can’t blame her, but at the time I felt betrayed. Though I couldn’t articulate it then, taking those foods away from me was taking away the one thing that made me feel like I wasn’t alone. I was already the chubby black girl; I didn’t want to be the chubby black girl on a diet.
As I grew older, I prided myself on being good. I volunteered. I got straight A’s. I didn’t drink, smoke, have sex, or do drugs. But I ate.
What had begun as a way of burying my insecurities morphed into a way of self-medicating full-blown depression and anxiety. Food was my salve and my secret. By the time I was a high schooler in Arkansas, where we had moved when I was 14, I was regularly driving through the local Chinese restaurant, eating crab rangoon alone in my car in the parking lot of an abandoned strip mall. Overwhelmed by a laundry list of extracurriculars that I hoped would get me into the “right college” — student council, cheerleading, theatre, National Honor Society, Key Club, jazz, tap, ballet — I ate until I was too full to worry. When I was cast in my senior musical, I ran to my car after last bell and sped up the highway to Sonic to buy Cinnasnacks (think mini-cinnamon rolls, but more gross) and a cherry limeade in the half hour before first rehearsal. I realized what was happening wasn’t normal when I thought more about what I’d eat when I got to my friends’ houses than the time I’d spend with them.
At the time, I tried to figure out what was wrong with me the same way I tried to find solutions to all of my problems as a teen: magazines. Yet, in article upon article, all I saw were stock images of thin white girls with whom I seemed to have nothing in common. I was obviously not anorexic. I never could throw up after eating, though god knows I tried, so bulimia was out. And while my habits were definitely in line with bingeing, which wasn’t recognized as its own disorder until 2013, I never felt like I ate quite enough to qualify. I had a tendency to buy a lot of things on impulse, take a few bites, then throw them away. I once read somewhere that Lindsay Lohan poured water on her food after she was full so she’d stop eating; I’d subsequently watched many half-eaten tubs of ice cream swirl down the drain.
I hoped going to my dream college would somehow absolve me of my lack of self-worth and, with that, my eating habits. Instead, I spent much of my freshman and sophomore years at Brown feeling like a fraud and making full use of my unlimited meal plan by stuffing to-go containers and eating alone in my dorm room.
Eventually, I began seeing a therapist, who diagnosed me with dysthymia — a low-grade, chronic form of depression — and generalized anxiety disorder. I also began seeing a personal trainer. By senior year, my body finally felt like it fit my 5-foot-2-inch frame. I spoke in class like what I had to say actually mattered. Instead of ruminating alone and in doubt, I opened up to friends and socialized. I went on spring break in Florida and took pictures in a bikini for the first time ever. I felt more in control of my life than I ever thought I could. I was finally, finally, happy.
But, despite my progress, there was one hurdle for which I couldn’t shake my anxiety: finding a job. An aspiring journalist, I had carefully checked off all the necessary boxes — writing courses, writing and editing for campus publications, three internships — but was terrified of rejection. So instead, I joined Teach for America after graduating in 2012, rationalizing it as a necessary experience to one day write about social justice issues. After a few months teaching third grade at a charter school north of Providence, I was miserable. Inexperienced and ill-equipped to handle the needs of my students, I began yo-yoing between jars of baby food that I’d eat as meals and cartons of Chinese food and quickly gained back half the weight I’d previously lost.
So, I finally sought out a second therapist who specialized in weight and body issues.
“The only reason you felt happy your senior year is because you were thin,” she told me during one of our first sessions. It was then that I learned the name for what I’d been struggling with my entire life: disordered eating, in my case chronic enough that it was periodically a full-blown, though unspecified, eating disorder (the distinction between the two is the frequency and severity of patterns). My therapist coaxed me to recognize how my entire identity and self-esteem seemed dependent on what was on my plate at any given moment. She pointed out that even when I had felt my best, I was undercounting calories, considering a couple dozen spears of asparagus or a couple of eggs to be adequate dinners, despite running regular 5Ks at the time. Instead of becoming healthier during college, I had swung from one extreme to the other. Now I was bouncing back and forth between the two.
Yet, as thankful as I was to have a more concrete understanding of what was going on with me, I rejected her theory. After all, I thought, much more had changed that year than just my weight and diet. The real problem was my job. The real problem was Rhode Island. So, I quit and I left. And, like a bad movie on loop, within a few months in New York I was juice cleansing and takeout bingeing, with a job at a fashion magazine where I was thankful for a cubicle so that no one could see me inhale the finest Midtown’s hot buffet delis had to offer. Then, for a host of reasons, I quit that job after half a year and spent my “funemployment” obsessively looking for another one, watching all of Breaking Bad, and ordering Seamless at midnight.
Pause. Play. Rewind. Repeat.
I’m now nearing the end of my second year in New York, and by and large my life has begun to stabilize. I’ve moved out of a claustrophobic apartment I shared with roommates when I first got to the city into one of my own, and have both a job and a boyfriend I love. I cook more and, overall, eat much better, often Instagramming the meals I’m most proud to have made.
And yet — two weekends ago, I visited my parents in Arkansas, and it went badly: My boyfriend and I were fighting, the flights were changed because of bad weather. Exhausted, I spent much of my airport layover on the way back to NYC agonizing over what to eat, wanting nothing more than to drown myself in a combo plate at the King Wah Express, yet ultimately settling on a sensible salad from the glaringly obvious sensible salad place (“green to greens…” “earth fresh…”). The canned salmon was too pale, the dressing too much like something out of a Kraft bottle, and I was too aware of being the overweight woman eating a salad. I pushed it over to the side and grabbed my wallet. After another lap around the food court, I was back in front of King Wah Express.
“How much is just a side of lo mein?” I asked the woman behind the counter.
“$4.99.”
It wasn’t a lot, but I was frustrated that I’d already spent $13 on something that was going in the trash. I changed course.
“I’ll take two crab rangoon, please.”
I sat back down and ate them my usual way: crispy corners first, then soft, squishy middle full of filling. As I dribbled duck sauce out of individual packets and wiped grease off my fingers, I wondered, like so many times before, if my eating habits will — can — ever really sustainably change. I pulled up the waistband of my leggings, aware of the strings already unraveling at the seams in the thigh and that I’d just bought them a little over a month ago. Packing for this trip was easy; I am at the heaviest I’ve ever been and most of my clothes didn’t fit anyway.
The last time I ate crab rangoon, it was 2013 and I was still living in Rhode Island. After failing to go to the YMCA that was across the street from my apartment, I had purchased a membership at a discount gym in a small town 10 minutes away because, somehow, that seemed like a better motivator than a building I could literally stare at out of my bedroom window. I can count the number of times I went to that gym on two hands and have few memories of it, but I do remember the Chinese buffet that was in the shopping center next door. I went to it twice: one time to eat inside, in a pleather booth near a couple and their annoying kids, the other to eat takeout, in a red plastic Ikea chair in my kitchen.
I can’t believe I am fucking here. Again, I thought, as I thumbed crumbs off the airport table.
But that was two weeks ago.
I’ve come to realize I eat the same way I hit my snooze button every morning: just a little bit more. Tired when I should feel energized, so empty despite being so full. Food is still the first thing I think about when I wake up and the last thing I think about before I go to bed. I still spend much of my time trying to hide just how much I eat it. After nine months in my own place, I’ve yet to buy my own microwave, hoping the lack of ease with which I can heat things will keep me from eating myself out of control. I’ve also yet to find a therapist in the city, an endeavor I’ve embarked on most weeks since I moved here and feel wholly overwhelmed by. However, I’m slowly, finally, acknowledging that my disordered eating — though inextricably intertwined with other issues — is also its own source of unhappiness, rather than a symptom of it.
And now I’m trying a new routine. Today was my fourth day starting my morning curled on my couch, sipping a cup of tea before I reach for the handle of the fridge. Before I leave my apartment, I pack lunch — a proper serving of “pad thai” made with spaghetti squash and shrimp, which I relished making earlier in the week, plus blueberries — in a plastic teal bento box with dorky handles. I feel equal parts embarrassed and ecstatic about carrying it on the subway and into my office, mindful of what my co-workers might think of such a marked departure from the spread of constant, countless snacks I’ve carted to my desk, but knowing after I’ve finished what’s inside, I’ll feel better somehow. This time, I won’t throw it away.
Resources
If you or someone you know is struggling with an eating disorder, here are some organizations that have trained support staff available by phone:
National Association of Anorexia Nervosa and Associated Disorders Helpline: 1-630-577-1330
Binge Eating Disorder Association Helpline: 1-855-855-BEDA
National Eating Disorder Association Helpline: 1-800-931-2237
Read more: http://www.buzzfeed.com/anitabadejo/confessions-of-a-disordered-eater
0 notes
marvyn-reads · 7 years
Link
In October 1957, the Soviet Union launched the Earth’s first artificial satellite, Sputnik 1. The craft was no bigger than a beach ball, but it spurred the US into a frenzy of research and investment that would eventually put humans on the Moon. Sixty years later, the world might have had its second “Sputnik moment.” But this time, it’s not the US receiving the wake-up call, but China; and the goal is not the exploration of space, but the creation of artificial intelligence.
The second Sputnik arrived in the form of AlphaGo, the AI system developed by Google-owned DeepMind. In 2016, AlphaGo beat South Korean master Lee Se-dol at the ancient Chinese board game Go, and in May this year, it toppled the Chinese world champion, Ke Jie. Two professors who consult with the Chinese government on AI policy told The New York Times that these games galvanized the country’s politicians to invest in the technology. And the report the pair helped shape — published last month — makes China’s ambitions in this area clear: the country says it will become the world’s leader in AI by 2030.
“It’s a very realistic ambition,” Anthony Mullen, a director of research at analyst firm Gartner, tells The Verge. “Right now, AI is a two-horse race between China and the US.” And, says Mullen, China has all the ingredients it needs to move into first. These include government funding, a massive population, a lively research community, and a society that seems primed for technological change. And it all invites the trillion-dollar question: in the coming AI Race, can China really beat the US?
AlphaGo was a “Sputnik moment” for Chinese politicians, triggering new investment by the state in AI research. Photo: Google
Strength in numbers
To build great AI, you need data, and nothing produces data quite like humans. This means China’s massive 1.4 billion population (including some 730 million internet users) might be its biggest advantage. These citizens produce reams of useful information that can be mined by the country’s tech giants, and China is also significantly more permissive when it comes to users’ privacy. For the purposes of building AI, this compares favorably with European countries and their “citizen-centric legislation,” says Mullen. Companies like Apple and Google are designing workarounds for this privacy problem, but it’s simpler not to bother in the first place.
China’s 1.4 billion population is a data gold mine for building AI
In China, this also means that AI is being deployed in ways that might not be acceptable in the West. For example, facial recognition technology is used for everything from identifying jaywalkers to dispensing toilet paper. These implementations seem trivial, but as any researcher will tell you, there’s no substitute for deploying tech in the wild for testing and developing. “I don’t think China will have the same level of existential crisis about the development of AI that the West will have,” says Mullen.
The adventures of Microsoft chatbots in China and the US make for a good comparison. In China, the company’s Xiaoice bot, which is downloadable as an app, has more than 40 million users, with regulars talking to it every night. It even published a book of poetry under a pseudonym, sparking a debate in the country about artificial creativity. By comparison, the American version of the bot, named Tay, was famously shut down in a matter of days after Twitter users taught it to be racist.
Matt Scott, CTO of Beijing machine vision startup Malong Technologies, says China’s attitude toward new technology can be “risk-taking” in a bracing way. “For AI you have to be at the cutting edge,” he says. “If you’re using technology that’s one year old, you’re outdated. And I definitely find that in China — at least, my community in China — is very adept at taking on these risks.”
A culture of collaboration
The output of China’s AI research community is, in some ways, easy to gauge. A report from the White House in October 2016 noted that China now publishes more journal articles on deep learning than the US, while AI-related patent submissions from Chinese researchers have increased 200 percent in recent years. The clout of the Chinese AI community is such that at the beginning of the year, the Association for the Advancement of Artificial Intelligence rescheduled the date of its annual meeting; the original had fallen on Chinese New Year.
What’s trickier, though, is knowing how these numbers translate to scientific achievement. Paul Scharre, a researcher at the think tank Center for a New American Security, is skeptical about statistics. “You can count the number of papers, but that’s sort of the worst possible metric, because it doesn’t tell you anything about quality,” he says. “At the moment, the real cutting-edge research is still being done by institutions like Google Brain, OpenAI, and DeepMind.”
In China, though, there is more collaboration between firms like these and universities and government — something that could be beneficial in the long term. Scott’s Malong Technologies runs a joint research lab with Tsinghua University, and there are much bigger partnerships like the “national laboratory for deep learning” run by Baidu and the Chinese government’s National Development and Reform agency.
Other aspects of research seem influential, but are difficult to gauge. Scott, who started working in machine learning 10 years ago with Microsoft, suggests that China has a particularly open AI community. “I think there is a bit more emphasis on [personal] relationships,” he says, adding that China’s ubiquitous messaging app WeChat is a rich resource, with chat groups centered around universities and companies sharing and discussing new research. “The AI communities are very, very alive,” he says. “I would say that WeChat as a vehicle for spreading information is highly effective.”
Government agencies like DARPA still fund lots of AI and robotics research, but proposed funding cuts are troubling. Photo by Chip Somodevilla / Getty Images
Remember: the government helped make the internet
What most worries Scharre is the US government’s current plans to retreat from basic science. The Trump administration’s proposed budget would slash funding for research, taking money away from a number of agencies whose work could involve AI. “Clearly [Washington doesn’t] have any strategic plan to revitalize American investment in science and technology,” Scharre tells The Verge. “I am deeply troubled by the range of cuts that the Trump administration is planning. I think they’re alarming and counterproductive.”
Trump’s administration could never be called “science-friendly”
The previous administration was aware of the dangers and potential of artificial intelligence. Two reports published by the Obama White House late last year spelled out the need to invest in AI, as well as touching on topics like regulation and the labor market. “AI holds the potential to be a major driver of economic growth and social progress,” said the October report, noting that “public- and private-sector investments in basic and applied R&D on AI have already begun reaping major benefits.”
In some ways, China’s July policy paper on AI mirrors this one, but unlike the US, China didn’t just go through a dramatic political upheaval that threatens to change its course. The Chinese policy paper says that by 2020 it wants to be on par with the world’s finest; by 2025 AI should be the primary driver for Chinese industry; and by 2030, it should “occupy the commanding heights of AI technology.” According to a recent report from The Economist, having the high ground will pay off, with consultancy firm PwC predicting that AI-related growth will lift the global economy by $16 trillion by 2030 — with half of that benefit landing in China.
Where do we go from here?
For Scharre, who recently wrote a report on the threat AI poses to national security, the US government is laboring under a delusion. “A lot of people take it for granted that the US builds the best tech in the world, and I think that’s a dangerous assumption to make,” he says, saying that a wake-up call is due. China may have had the “Sputnik moment” it needed to back AI, but has the US?
Others question whether this is necessary. Mullen says that while the momentum to be the world leader in AI currently lies with China, the US is still marginally ahead, thanks to the work of Silicon Valley. Scharre agrees, and says that government funding isn’t that big of an issue while US tech giants are able to redirect just a little of their ad money to AI. “Money you get from somewhere like DARPA is just a drop in the ocean compared to what you can get from the likes of Google and Facebook,” he says.
These companies also provide a counterpoint to the argument that China’s demographics give it an unmatchable advantage. It’s certainly good to have a huge number of users in one country, but it’s probably better to have that same number of users spread across the world. Both Facebook and Google have more than 2 billion people hooked on to their primary platforms (Facebook itself and Android) as well as a half-dozen other services with a billion-plus users. It’s arguable that this sort of reach is more useful, as it provides an abundance of data, as well as diversity. China’s tech companies may be formidable, but they lack this international reach.
US tech companies like Google have international reach that Chinese firms can’t match. Photo by Justin Sullivan / Getty Images
Scharre suggests this is important, because when it comes to measuring progress in AI, on-the-ground implementations are worth more than research. What counts, he says, is “the ability of nations and organizations to effectively implement AI technologies. Look at things like using AI in healthcare diagnoses, in self-driving cars, in finance. It’s fine to be, say, 12 months behind in research terms, as long as you can still get ahold of the technology and use it effectively.”
In that sense, the AI race doesn’t have to be zero sum. Right now, cutting-edge research is developed in secret, but shared openly across borders. Scott, who has worked in the field in both the US and China, says the countries have more in common than they think. “People are afraid that this is something happening in some basement lab somewhere, but it’s not true,” he says. “The most advanced technology in AI is published, and countries are actively collaborating. AI doesn’t work in a vacuum: you need to be collaborative.”
In some ways, this is similar to the situation in 1957. When news of Sputnik’s launch first broke, there was an air of scientific respect, despite the geopolitical rivalry between the US and USSR. A contemporary report said that America’s top scientists “showed no rancor at being beaten into space by the Soviet engineers, and, as one of them put it, ‘We are all elated that it is up there.’”
Throughout the ‘60s and early ‘70s, America and Russia jockeyed back and forth to be “first” in the space race. But in the end, the benefits of this competition — new scientific knowledge, technology, and culture — didn’t just go to the winner. They were shared more evenly than that. By this metric, a Sputnik moment doesn’t have to be cause for alarm, and the race to build better AI could still benefit us all.
via The Verge
0 notes