#optimization
Quote
We optimize ourselves to death. Relentless self-exploitation leads to mental collapse. Brutal competition ends in destruction. It produced an emotional coldness and indifference towards others as well as towards one's own self.
Byung-chul Han, Capitalism and the Death Drive
#optimization#measurement#quantity#exploitation#competition#capitalism#self#other#quotes#Han#Byung-chul Han#Capitalism and the Death Drive
436 notes
Text
"Making systems resilient is fundamentally at odds with optimization, because optimizing a system means taking out any slack. A truly optimized, and thus efficient, system is only possible with near-perfect knowledge about the system, together with the ability to observe and implement a response. For a system to be reliable, on the other hand, there have to be some unused resources to draw on when the unexpected happens, which, well, happens predictably." (Deb Chachra, How Infrastructure Works, p. 209) Another way to look at this is that you cannot optimize for resilience. Resilience requires a kind of elasticity, an ability to stretch and reach but then to return, to spring back into a former shape—or perhaps to shapeshift into something new if the circumstances require it. Resilience is stretchy where optimization is brittle; resilience invites change where optimization demands continuity.
—Mandy Brown, from her post "Against Optimization" on A Working Library
202 notes
Text
Game Optimization and Production
I wanted to write a light primer about optimization and how it relates to game production, in the event people just don't know how it works, based on my experience as a dev. I'm by no means an expert in optimization myself, but I've done enough of it on my own titles and planned around it enough at this point to understand the gist of what it comes down to and the considerations therein. Spoilers: unoptimized games are rarely the result of lazy devs, and more often the result of games being incredibly hard to make and studios being notoriously cheap.
(As an aside, this was largely prompted by seeing someone complaining about how "modern" game developers are 'lazy' because "they don't remember their N64/Gamecube/Wii/PS2 or PS3 dropping frames". I feel compelled to remind people that 'I don't remember' is often the key part of the "old consoles didn't lag" equation, because early console titles ABSOLUTELY dropped frames and way more frequently and intensely than many modern consoles do. Honestly I'd be willing to bet that big budget games on average have become more stable over time. Honorable mention to this thread of people saying "Oh yeah the N64 is laggy as all hell" :') )
Anywho, here goes!
Optimization
The reason games suffer performance problems isn't because game developers are phoning it in or half-assing it (which is always a bad-faith accusation when most devs work under unrealistic deadlines, for barely enough pay, in crunch conditions). Optimization issues like frame drops are often because of factors like ~hardware fragmentation~ and how that relates to the realities of game production.
I think the general public sees "optimization" as "oh, the dev decided to do a lazy implementation of a feature instead of a good one" or "this game has bugs", which is very broad and often very misguided. Optimization is effectively tuning a game so that its performance is acceptable to the maximum number of people. What that takes differs for every game and its specific context, from lowering shader passes, to refactoring scripts, to just plain re-doing work in a more efficient way. Rarely is it just one or two things, and it's informed by many factors which vary wildly between projects.
However, the root cause why any of this is necessary in the first place is something called "Platform Fragmentation".
What Is Fragmentation
"Fragmentation" is the possibility space of variation within hardware being used to run a game. Basically, the likelihood that a user is playing a game on a different hardware than the one you're testing on - if two users are playing your game on different hardware, they are 'fragmented' from one another.
As an example, here's a graphic that shows the fragmentation of mobile devices by model and user share; each box's size reflects how many users are on that model of phone:
[image: grid of mobile device models, sized by user share]
As you can tell, that's a lot of different devices to have to build for!
So how does this matter?
For PC game developers, fragmentation means that an end-user's setup is virtually impossible to predict, because PC users frequently customize and change their hardware. Any two PC players may be on entirely different hardware.
Is your player using an up-to-date GPU? CPU? How much RAM do they have? Are they playing on a notebook? A gaming laptop? What brand hardware are they using? How much storage space is free? What OS are they using? How are they using input?
Moreover, PC parts don't often get "sunsetted" whole-cloth like old consoles do, so there's also the factor of having to support hardware that could be coming up on 5, 10, or 15 years old in some cases.
For console developers it's a little easier - you generally know exactly what hardware you're building for, and you're often testing directly on a version of the console itself. This is a big reason why Nintendo's first party titles feel so smooth - because they only build for their own systems, and know exactly what they're building for at all times. The biggest unknowns are usually smaller things like televisions and hookups therein, but the big stuff is largely very predictable. They're building for architecture that they also made themselves, which makes them incredibly privileged production-wise!
Fragmentation basically means that it's difficult - or nearly impossible - for a developer to know exactly what their users are playing their games on, and even more challenging to guarantee their game is compatible everywhere.
Benchmarking
Since fragmentation makes it very difficult to build for absolutely everybody, at some point during development every developer has to draw a line in the sand and say "Okay, [x] combination of hardware components is what we're going to test on", and prioritize that calibre of setup above everything else. This is both to make testing easier (so testers don't have to play the game on every single variation of hardware) and to assist in optimization planning. This is a "benchmark".
Usually the benchmark requirements are chosen to balance visual fidelity, gameplay, and the percentage of the market you're aiming for, among other considerations. For a game that is cross-platform between PC and console, this benchmark will often be informed by the console requirements in some way, since those set the bar for a target market (a cross-platform PC and console game isn't going to set a benchmark that is impossible for a console to hit, though it might push the limits if PC users are the priority market). Sometimes games hit their target benchmarks, sometimes they don't - as with anything in game development, it can be a real crapshoot.
In the case of my games, which are often graphically intensive and poorly made by myself alone, my benchmark is often a machine that is roughly five years old, and I usually take measures to avoid practices which are generally bad and can build up to become very expensive over time. Bigger studios with more people aiming at modern targets will likely prioritize hardware from within the last couple of years so their games look their best for users with the newest hardware - after all, other users will often catch up as hardware evolves.
This benchmark allows devs to have breathing room from the fragmentation problem. If the game works on weaker machines - great! If it doesn't - that's fine, we can add options to lower quality settings so it will. In the worst case, we can ignore it. After all, minimum requirements exist for a reason - a known evil in game development is that not everyone will be able to run your game.
Making The Game
As with any game, the more time you spend on something, the more money is being spent on it - and in some cases, extensive optimization isn't worth the return on investment. A line needs to be drawn, because at some point not everyone can play your game on everything; throwing in the towel and saying "this isn't great, but it's good enough to ship" needs to happen if the game is going to ship at all.
Optimizing to make sure that the 0.1% of users with specific hardware can play your game probably isn't worth spending a week of work on. Frankly, once you hit a certain point, some of those concerns are more easily put off until post-launch, when you know how much engagement your game has, how many users of certain hardware are actually playing, and how much time/budget you have to spend post-launch on improving the game for them. Especially in this "Games As A Service" market, people frequently expect games to receive constant updates on things like performance after launch, so there's always more time to push changes and smooth things out as time goes on. Studios are also notoriously squirrelly with money, and many would rather get a game out into paying customers' hands than sit around making sure that everything is fine-tuned (in contrast to most developers, who would rather the game they've worked on for years be fine-tuned than not).
Compare this to the pre-Day One patch era: once you printed a game on a disc, it was there forever, with no improving it or turning back. A frightening prospect, and one which resulted in lots of games just straight up getting recalled because they shipped with bugs or things that didn't work. 😬
The point is, though, that targeted optimization happens as part of the development process, and optimization in general is often something every team helps out with organically as production goes on - level designers refactor scripts to be more efficient, graphics programmers update shaders to cut down on passes, artists trim poly counts where they can to gradually achieve better performance. It's an all-hands-on-deck sort of approach that affects all devs, and often something that is progressively tracked as development rolls on, as a few small things can add up to larger performance issues.
In large studios, every developer is in charge of optimizing their own content to some extent, and dedicated performance teams are often formed to find the easiest, safest, and quickest optimization wins. Unless you plan smartly in the beginning, some optimizations can also just be deemed too dangerous and out-of-reach to carry out late in production, as they may have dependencies or risk compromising core build stability - at the end of the day, more frames aren't worth a crashing game.
Conclusion
Games suffer from performance issues because video game production is immensely complex and there's a lot of different shifting factors that inform when, how, and why a game might be optimized a certain way. Optimization is frequently a production consideration as much as a development one, and it's disingenuous to imply that games lag because developers are lazy.
I think it's worth emphasizing that if optimization doesn't happen, isn't accommodated, or is undervalued as part of the process, it's rarely if ever because the developers didn't want to do it; rather, it's because it would cost the studio too much money. As with everything in our industry, the company is the one calling the final shots in development. If a part of a game seems to have fallen behind in development, it's often because the studio deemed it acceptable and refused to move deadlines or extend a hand to help it come together better, for fear of spending more money on it. Rarely if ever should individual developers be held accountable for the failings of companies!
Anywho, thanks for reading! I know optimization is a weird mystical sort of blind spot for a lot of dev folks, so I hope this at least helps shed some light on considerations that weigh in as part of the process on that :) I've been meaning to write a more practical workshop-style step-by-step on how to profile and spot optimization wins at some point in the future, but haven't had the time for it - hopefully I can spin something up in the next few weeks!
#gamedev#game development#game dev#indie games#indie game#gamedevelopment#indiegames#pc gaming#pc games#indie dev#indiedev#video games#video game#blog#thoughts#optimization
88 notes
Note
You have talked at length about standard optimizations (LoD, not rendering things, deleting cars on a distant highway to avoid memory overflow, etc). What are the weirdest optimization strategies you have seen?
One of my favorites was detecting animated background characters on screen, skipping 80% of their animation updates, and instead interpolating between the animation frames during the missing frame updates. Since they were just auto-looping background characters and not often looked at, most players didn't notice a difference but it saved a lot on the animation calculation overall.
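For the curious, here's a minimal sketch of what that kind of throttling might look like in code. It's illustrative only - the interfaces and numbers are hypothetical, not from any real engine:

```typescript
// Hypothetical sketch: sample the real animation only every Nth frame and
// linearly interpolate poses on the frames in between.

interface Pose { jointAngles: number[]; }

function lerpPose(a: Pose, b: Pose, t: number): Pose {
  return { jointAngles: a.jointAngles.map((v, i) => v + (b.jointAngles[i] - v) * t) };
}

class BackgroundAnimator {
  private prevPose: Pose;
  private nextPose: Pose;
  private framesSinceSample = 0;
  private readonly sampleInterval = 5; // real update every 5th frame (~80% skipped)

  // sampleAnimation advances the real animation system by N frames and
  // returns the resulting pose (the expensive part we want to avoid).
  constructor(private sampleAnimation: (advanceFrames: number) => Pose) {
    this.prevPose = sampleAnimation(0);
    this.nextPose = sampleAnimation(this.sampleInterval);
  }

  update(): Pose {
    this.framesSinceSample++;
    if (this.framesSinceSample >= this.sampleInterval) {
      // Expensive path: take a fresh real sample.
      this.prevPose = this.nextPose;
      this.nextPose = this.sampleAnimation(this.sampleInterval);
      this.framesSinceSample = 0;
    }
    // Cheap path: blend between the two most recent real samples.
    return lerpPose(this.prevPose, this.nextPose,
                    this.framesSinceSample / this.sampleInterval);
  }
}
```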
[Join us on Discord] and/or [Support us on Patreon]
Got a burning question you want answered?
Short questions: Ask a Game Dev on Twitter
Long questions: Ask a Game Dev on Tumblr
Frequent Questions: The FAQ
29 notes
Text
[embedded YouTube video]
Yeah, I needed someone to break down why this remake was killing standard GPUs. Optimization matters.
I know this practice of offloading graphics processing costs onto consumer hardware mostly occurs because publishers don't want to pay for optimization. This is ridiculous considering that the game costs $70 plus a fraction of your GPU's life span.
The SH2 remake was pretty good, but I'm not letting triple-A development get away with stuff like this. Going through multiple hard crashes left me feeling bitter about the experience, so I have to leave a negative review until it is fully optimized.
10 notes
Text
Testing out performance with 492 enemies using pathfinding. They tell a central command object when they need new orders and "get in line" for where to go, meaning only one path is calculated per step. That's way more enemies than I intend to have active at once, so I'm happy with it. My focus for this prototype is mostly to play with NPC behavior (I'd have said AI, but I don't want the confusion), as that and optimization are what interest me the most. The goal is "fast combat with stealth elements" gameplay, meaning lots of running and gunning, but using the map to your advantage to flank, lure, etc. Enemies will patrol and switch between alert levels, but I don't really want crouching and hiding to be part of the loop. I'm thinking of doing HP as "luck", where shots that "hit" you while you have luck left actually just missed, but standing in front of several enemies in the open will be punished.
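For fellow devs, a minimal sketch of what that "get in line" pattern might look like - all names hypothetical, not the actual project code:

```typescript
// Enemies enqueue themselves with a central command object; the command
// computes at most one (expensive) path per game step.

type Point = { x: number; y: number };

interface PathRequester {
  position: Point;
  target: Point;
  receivePath(path: Point[]): void;
}

class CentralCommand {
  private queue: PathRequester[] = [];

  // An enemy that needs new orders gets in line (once).
  requestOrders(enemy: PathRequester): void {
    if (!this.queue.includes(enemy)) this.queue.push(enemy);
  }

  // Called once per step: pop one requester and serve it, so pathfinding
  // cost stays constant no matter how many enemies are active.
  step(findPath: (from: Point, to: Point) => Point[]): void {
    const enemy = this.queue.shift();
    if (enemy) enemy.receivePath(findPath(enemy.position, enemy.target));
  }
}
```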
I copy pasted that owlman inside the wall and I laughed when I noticed him, he's a good kid.
PS Broadside Renegades difficulty/onboarding phase 1 is on the beta branch, with easy difficulty, new Left/Right indicators and new ending cutscenes
#game dev#broadside renegades#game development#indie dev#game design#pixel art#indie games#gamedev#prototype#top down#optimization#video games
9 notes
Quote
What will happen once the authentic mass man takes over, we do not know yet, although it may be a fair guess that he will have more in common with the meticulous, calculated correctness of Himmler than with the hysterical fanaticism of Hitler, will more resemble the stubborn dullness of Molotov than the sensual vindictive cruelty of Stalin.
Hannah Arendt, The Origins of Totalitarianism
#philosophy#quotes#Hannah Arendt#The Origins of Totalitarianism#instrumentalism#mechanism#optimization#control#authority#capitalism#totalitarianism#authoritarianism
90 notes
Text
Me, talking about D&D: So, I really want to play a mix of gunk and BM but my party already has a padlock, a shepherd, a gravy built around magic bones, and one of those new goos. I'm worried that I'm going to feel like I'm not enough of a munchkin. Like, maybe my dungeon master might decide to play battleship, but then if we're not doing it raw on the table like the crawfish intended, then power word tiptoe probably isn't going to do anything.
My therapist, pouring gasoline all over her office: Well, you could just go beast mode.
Me: Yeah, but I don't think I'm allowed to have infinite tail armor. That just feels like the kind of punpun shit that makes rocks fall...
19 notes
Text
The most productive way to do something is the one you finish
Optimizing your tasks to take the least amount of time does nothing if that "optimized" version is the one you never actually want to do or finish. If your way of doing something takes 5x longer than some other way, but the "longer" way is the only way you can or want to do it, then that is the most productive way for you, and don't allow yourself to be shamed for that.
29 notes
Quote
If the goal is the sole point of orientation, then the spatial interval to be crossed before reaching it is simply an obstacle to be overcome as quickly as possible. Pure orientation towards the goal deprives the in-between space of all meaning, emptying it to become a corridor without any value of its own. Acceleration is the attempt to make the temporal interval that is needed for bridging the spatial interval disappear altogether. The rich meaning of the path disappears. Acceleration leads to a semantic impoverishment of the world. Space and time no longer mean very much.
Byung-chul Han, The Scent of Time
#goals#objectives#instrumentality#mechanism#optimization#meaning#time#space#quotes#Han#Byung-chul Han#The Scent of Time
113 notes
Link
#biology#CFD#computational fluid dynamics#fish#fluid dynamics#optimization#physics#science#swimming#undulatory swimming
65 notes
Text
Spent an hour today joining all my incredibly dense world chunk geometry and exporting it as individual objects. It was really boring.
Nothing has functionally changed and all the geo still looks the same, BUT because of the sheer number of individual models I was using before, rendering the entire scene now takes only ~4ms instead of the 16ms it was previously. Everything is super smooth now!
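For anyone wondering why merging meshes helps at all: each individual object tends to carry a fixed per-draw-call overhead on the CPU/driver side, so the same triangles cost less in fewer, bigger batches. A toy cost model (every constant here is made up for illustration):

```typescript
// Toy model: per-frame cost = (fixed overhead per draw call) + (triangle cost).
interface Mesh { triangles: number; }

const DRAW_CALL_OVERHEAD_MS = 0.01;     // made-up fixed cost per draw call
const COST_PER_TRIANGLE_MS = 0.000001;  // made-up cost per triangle

function frameCost(meshes: Mesh[]): number {
  return meshes.reduce(
    (ms, m) => ms + DRAW_CALL_OVERHEAD_MS + m.triangles * COST_PER_TRIANGLE_MS, 0);
}

const chunks: Mesh[] = Array.from({ length: 1500 }, () => ({ triangles: 2000 }));
const joined: Mesh[] = [{ triangles: 1500 * 2000 }]; // same geometry, one object

console.log(frameCost(chunks).toFixed(1), 'ms'); // 18.0 ms: overhead dominates
console.log(frameCost(joined).toFixed(1), 'ms'); //  3.0 ms: one draw call
```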
Wishlist Centauri Dark || Join My Discord
#gamedev#game development#game dev#indie games#indie game#gamedevelopment#indiegames#indie dev#indiedev#centauri dark#codeblr#optimization#game optimization
12 notes
Note
These references are out of date, so I hope you'll bear with me. Why is it that games like Watchdogs 2 can have a whole city full of NPCs, each with mostly unique profiles and all interactable (you can hack almost everyone, you can physically interact, etc.), but something like Yandere Simulator struggles to keep its frame rate with not even 200 NPCs? The models in Watchdogs 2 are also more hyper-realistic, so I don't know if that means more framerate impact?
What you're seeing are the programming principles of optimization and scalability in effect. These two principles are more than the sum of their parts; they are multiplicative in their effectiveness (or lack thereof). Thus, if there's a situation where we need to optimize at scale, the results are very pronounced. When we talk about performance, it helps to think of it as costs to do things. We spend system resources (CPU time, GPU time, system memory, etc.) to perform tasks (load a dude, draw a dude, animate a dude). Optimization is being clever about not wasting our resources doing unnecessary things; this lowers the cost of performing these tasks. Scalability is the other factor - the number of things there are multiplies their overall cost. This should make intuitive sense.
Let's have an example - imagine that you need cupcakes for a party. The cupcakes cost $5 each and there's a $20 flat delivery fee. We need five cupcakes for a party, so the cost is $20 (delivery) + $25 (5 cupcakes x $5) for a total of $45. Optimization is our way of reducing the individual costs. We can optimize either the cost of the cupcakes or the cost of the delivery fee. Maybe we can optimize the delivery fee down to $10 but can only optimize the cupcake cost down by $1 each. We only have time to choose one optimization. In this case, optimizing the delivery fee results in a better overall cost reduction - 5 cupcakes x $5 apiece + $10 delivery is $35, while 5 cupcakes x $4 apiece + $20 delivery is $40.
Now think about what happens if the numbers change. Instead of needing five cupcakes for the party, let's say we need a thousand cupcakes. 1,000 cupcakes x $5 apiece + $20 delivery = $5,020. If we optimize the delivery fee, the cost becomes 1,000 cupcakes x $5 + $10 delivery = $5,010. Here, optimizing the cupcake price is a much better deal than optimizing the delivery fee! If we reduce the price per cupcake by $1, we get 1,000 cupcakes x $4 apiece + $20 delivery fee = $4,020.
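The same arithmetic as a tiny cost model, if it helps to see it laid out (the numbers are just the cupcake figures from above):

```typescript
// total cost = (number of things) * (per-unit cost) + (fixed overhead)
const total = (n: number, unitCost: number, flatFee: number) =>
  n * unitCost + flatFee;

// Small order: optimizing the fixed fee wins.
console.log(total(5, 5, 20)); // 45 (baseline)
console.log(total(5, 5, 10)); // 35 (cheaper delivery)
console.log(total(5, 4, 20)); // 40 (cheaper cupcakes)

// At scale, the per-unit cost dominates: optimizing it wins big.
console.log(total(1000, 5, 20)); // 5020 (baseline)
console.log(total(1000, 5, 10)); // 5010 (cheaper delivery)
console.log(total(1000, 4, 20)); // 4020 (cheaper cupcakes)
```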
Bringing this back to games, it should make sense now. Ubisoft spent a lot of engineering time optimizing the cost of each NPC (cupcake) down as much as possible because they knew that they would have a huge number of them in their game world. Yandere Simulator did not spend as much time optimizing their NPCs, so their NPCs are more costly than the WatchDogs NPCs.
24 notes
Text
Optimizing Website Performance: Best Practices for Faster Load Times
In today's era, if your website's performance isn't ideal, it can be one of the most frustrating things your users experience. Left unaddressed, this becomes a huge problem: to avoid higher bounce rates and lower search engine rankings, you should invest in optimizing your website's performance. An optimized website attracts more visitors, resulting in higher traffic and better conversion rates.
In this article, we will explore the different aspects of optimizing your website's performance. We will be:
Understanding the different performance metrics used to measure your website
Discussing best practices and techniques for improving website load times
Exploring advanced optimization techniques like browser caching and lazy loading
Exploring different tools that can help you monitor and enhance website performance
Understanding Website Performance Metrics
1. Website Speed
When you ask yourself, "What is an ideal speed for a website?", the only thing you might consider is how fast a page loads. But there's more to it than meets the eye. People today have shorter attention spans than ever, and it's crucial to know how your site performs across several speed-related aspects:
Time to Title:
This is the time between a visitor requesting your website and the site's title appearing in the browser tab. If the title appears quickly, it reassures visitors that the site is legitimate and trustworthy.
Time to Start Render:
This measures how long it takes for any content to appear on screen after a user requests your site. One of the worst experiences you can give users is a site that buffers endlessly without showing a single bit of content. Nobody likes to wait, and if your website shows visitors what they are searching for within a fraction of a second, they are more likely to stay.
Time to Interact:
This is the time from when a visitor first requests your site to when they can actually start interacting with it (like clicking on links or scrolling). The quicker they can engage, the more likely they are to stick around.
To start improving your site speed, focus on these three metrics. You can check them using free web page speed test tools.
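If you'd rather measure these in the browser yourself, the standard Performance APIs give rough proxies for all three. A sketch - note that the mappings to "Time to Title" and friends are approximations, not official definitions:

```typescript
// Rough browser-side proxies for the three metrics above.
const nav = performance.getEntriesByType('navigation')[0] as PerformanceNavigationTiming;

// Time to Title: roughly when the first response bytes (including <title>) arrive.
console.log('Time to Title (approx):', nav.responseStart, 'ms');

// Time to Start Render: first contentful paint, reported via PerformanceObserver.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.name === 'first-contentful-paint') {
      console.log('Time to Start Render (approx):', entry.startTime, 'ms');
    }
  }
}).observe({ type: 'paint', buffered: true });

// Time to Interact: roughly when the DOM is parsed and scripts can respond.
console.log('Time to Interact (approx):', nav.domInteractive, 'ms');
```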
2. Number of Assets
“Assets” are the building blocks of your webpage, including text, images, videos, and more. Each of these elements adds to your page’s load time. The more assets you have, the slower your page might load. Tools are available to help you analyze the size of these assets, and if they’re dragging down your load speed, consider hosting them externally.
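As a concrete example of trimming asset cost, one widely used technique is lazy loading: images below the fold don't download until the user scrolls near them. Modern browsers support this natively via the img loading="lazy" attribute; a manual sketch with IntersectionObserver looks roughly like this (the data-src convention is just an example):

```typescript
// Lazy-load images: each <img data-src="..."> gets its real src only when
// it comes within 200px of the viewport.
const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      const img = entry.target as HTMLImageElement;
      img.src = img.dataset.src!; // start the actual download now
      obs.unobserve(img);         // only needs to happen once per image
    }
  }
}, { rootMargin: '200px' });      // begin loading slightly before it's visible

document.querySelectorAll<HTMLImageElement>('img[data-src]')
  .forEach((img) => observer.observe(img));
```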
3. Error Rate
This metric measures the ratio of errors produced by your site to the normal requests it receives. A rise in errors can indicate a problem that could bring your site down if not addressed. Keeping a check on your error rate lets you prevent and correct errors before they escalate into disruptive events.
4. Bounce Rate
The bounce rate is the percentage of visitors who leave your website after viewing only the first page, typically within a very short time span. A very high bounce rate is a threat not only to your conversion rates but also to your SEO, as it signals that your site is not offering what visitors are looking for. You can find it with Google Analytics: go to Behavior > Site Content > Landing Pages to see the full report for your website, then scroll down to see the bounce rates of individual pages.
5. Unique Visitors
Unique visitors are the total number of distinct people who access your website daily, weekly, or monthly. This metric is key to measuring the growth of your website. Repeat visitors are of major importance too, but an increasing number of unique visitors indicates that you're bringing fresh audiences to your website.
6. Traffic Source
Traffic sources indicate the medium through which users arrive at your website. Knowing where visitors come from is as critical as knowing how much traffic you get: it lets you determine whether visitors arrive via organic search, social media, or referrals. Ideally, your traffic should come from a mix of sources; if it is weighted heavily towards any one of them, some form of content strategy revision might be called for to make the most of the sources bringing maximum traffic to the website.
Fun fact: you can monitor all of this in Google Analytics under Acquisition > All Traffic > Channels.
7. Conversion Rate
The conversion rate measures how well the site's visitors are converted into customers or leads. Huge traffic with low conversion rates probably indicates that the site's conversion strategies could yield better results than they presently do. In Google Analytics, this data can be accessed under Conversions > Overview.
8. Top Pages
Page performance also matters, so the pages that perform best deserve extra care. These could be pages with the highest percentage of conversions or the highest volume of visitors. Knowing which pages are doing well, and why, helps you improve the rest of your site based on what you learn from the existing high-performing pages. Top pages can be monitored in Google Analytics through Landing Pages and Exit Pages under Behavior > Site Content.
Landing Pages:
These are the pages through which users first enter your website. They are frequently labeled 'First Impression Pages', and as such they cannot be taken for granted for an instant and must be in great shape.
Exit Pages:
These are the last pages a visitor views immediately before leaving the website. Since these pages mark where visitors drop off, it's necessary to identify and improve them.
9. Keyword Ranking
Keyword ranking indicates how effectively a website ranks in search engines for a particular query. While a drop in ranking can be scary, as long as keywords are routinely monitored and updated, you can rest assured that your efforts to improve SEO are not in vain. There are also various tools for monitoring keyword rankings.
10. Average Session Duration
This refers to the average time visitors spend on your website during a single session. Longer sessions suggest greater willingness of users to buy your product or use your service. When analyzing this metric, businesses must account for the type of website they run. For instance, a news website might have a shorter average session than an e-commerce site, because readers quickly go through the articles and move on.
2 notes