deerest-me · 3 months ago
*puts on beret picks up cigarette holder and starts talking about differences between the TTYD remake soundtrack and the original*
#well you see.... ze use of dynamic tracking is a definite enhancement. zis cannot be argued... HOWEVER
#the choral vocal synthesizer has disgustingly wide vibrato. this is particularly observable in the final chapter where it causes the
#entire sinister atmosphere to become melodramatic / modern-day adventure movie esque. too orchestral. truly robs the original of its impact.
#and is not explainable by hardware limitation whatsoever. there are marked improvements in not only using waveform synthesis...
#e.g. the choral pads that pop up in the new sewers theme. very tasteful. chimes in the thousand year door chamber. shamisen in glitzville.
#electric guitar. definitive improvements. additionally bringing modern music sensibilities can be an enhancement: grodus' battle theme
#conveys his seriousness much better with an industrial electronic edge to it. plenty of additional examples throughout.
#however. comparing the final boss themes there is this ``enshittification'' of videogame remakes that can be seen
#the major hollywood adventure movie soundtrack problem aforementioned. play enough remakes and you'll notice it. blending into the same.
#the shadow queen is meant to have an air of ``sickliness''. of gloom. kinda her whole deal. lost in the remake in my opinion.
#(i am almost tempted to gather audio clips and play them side by side. to an audience of absolutely ZERO)
#this is not even mentioning that there are no BEES in the remake's final battle theme. <- the punchline of this entire thing
#anyways. this post was inevitable. i have an insane ear for soundtracks and sfx as u may have gathered.
hassank82-blog · 6 years ago
Machine learning beginning to shape data center infrastructure
The terms AI (artificial intelligence) and ML (machine learning) have become common technology buzzwords. Simply defined, AI is applying traits of human intelligence to computers; ML is a subset of AI, where inputs are mapped to outputs to derive meaningful patterns. Businesses and analysts alike are touting how transformative AI and ML can be, fueled by massive amounts of data from the ever-growing IoT (internet of things) ecosystem. While other buzzwords have come and gone without much tangible impact, AI doesn't seem to fit into that category. It is reaching far and wide, from manufacturing floors to supply chain management, and even to the operation of data centers themselves. Many enterprises are still planning and piloting AI applications to understand how the technology can transform their businesses, but data centers themselves already provide an early, successful use case.
 Google use case
Starting in 2015, Google began applying ML in its data centers to help them operate more efficiently. Devan Adams, Senior Analyst on the IHS Markit Cloud and Data Center Research Practice, outlines some of the practices and results: "Google's DeepMind researchers and its DC team began by taking historical data collected by thousands of sensors within its DCs, including temperature, power, water pump speeds, and set-points, then analyzed the data within deep neural networks to focus on lowering PUE (Power Usage Effectiveness), the ratio of total building energy usage to IT energy usage. In the end, they achieved up to 40% energy reduction for cooling and 15% reduction in overall PUE, the lowest the site had ever experienced. Google plans to roll out its system and share more details about it in an official publication so other DC and industrial system operators can benefit from its results."
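To make the PUE figures concrete, here is a small worked example. The function is just the PUE definition quoted above; the absolute loads are hypothetical numbers chosen for illustration, so the resulting overall PUE improvement depends on how large a share of the non-IT load cooling represents and will not necessarily match the 15% Google reported.

```python
# Worked PUE example. The 40% cooling-energy reduction is the figure quoted
# above; all absolute kW values below are hypothetical, for illustration only.

def pue(total_facility_kw: float, it_kw: float) -> float:
    """PUE = total facility power / IT power (1.0 is the theoretical ideal)."""
    return total_facility_kw / it_kw

it_load = 1000.0   # kW of IT load (assumed)
cooling = 350.0    # kW of cooling overhead (assumed)
other = 150.0      # kW of lighting, UPS losses, distribution, etc. (assumed)

before = pue(it_load + cooling + other, it_load)        # 1500 / 1000 = 1.50
after = pue(it_load + 0.6 * cooling + other, it_load)   # 40% less cooling -> 1.36

print(f"PUE before: {before:.2f}, after: {after:.2f}")
print(f"Overall PUE reduction: {(before - after) / before:.1%}")  # ~9% with these numbers
```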
This Google use case is a unique and relatively mature example of ML in data center operations. Google's existing cloud infrastructure, access to large amounts of data, and significant in-house expertise allowed it to become an early adopter of ML. For enterprise and colocation data center operators who lack those advantages, deploying ML in their data centers may seem like a daunting task, with significant cost and knowledge barriers to overcome. However, data center infrastructure suppliers are stepping in to bring ML-integrated cooling, power, and remote management capabilities to these data centers.
 Data center cooling
Cooling has become the primary place to start applying ML to data center infrastructure, largely because cooling consumes around 25% of the power a data center uses. Improving cooling efficiency therefore translates into serious savings, but it isn't an easy task. Data centers are dynamic environments, with changing IT loads, fluctuating internal and external temperatures, variable fan and pump speeds, and differing sensor locations. DCIM (data center infrastructure management) tools have been helpful in collecting, managing, and visualizing this data, but they remain tools that inform decisions made by human operators, and even with all the data one could need, humans are prone to errors. ML brings data-driven, real-time automation to cooling, and the benefits are clear, according to Ernest Sampera, Chief Marketing and Business Development Officer at colocation provider vXchange: "Data center cooling integrated with machine learning is a no-brainer – it has helped us reduce human errors, and overall power and costs associated with cooling. Our on-site technicians can spend more time on our customers instead of being focused on tuning cooling efficiencies."
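As a rough illustration of what data-driven, real-time cooling automation can look like, the sketch below fits a regression model to a historical DCIM export and uses it to score candidate chilled-water setpoints for the current conditions. The file name, column names, and setpoint range are assumptions made for this example; none of the vendors mentioned in this article have published their implementations in this form.

```python
# A minimal sketch of ML-assisted cooling tuning, not any vendor's actual product.
# The CSV file and its column names are assumptions made for illustration.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical DCIM export: IT load, outdoor temperature, chilled-water setpoint, measured PUE
history = pd.read_csv("dcim_history.csv")
X = history[["it_load_kw", "outdoor_temp_c", "chw_setpoint_c"]]
y = history["pue"]

model = GradientBoostingRegressor().fit(X, y)

# Score a range of candidate setpoints for the current conditions and pick the
# one the model predicts will yield the lowest PUE.
current = {"it_load_kw": 950.0, "outdoor_temp_c": 28.0}
candidates = np.arange(6.0, 14.1, 0.5)  # candidate chilled-water setpoints, in degC
frame = pd.DataFrame([{**current, "chw_setpoint_c": s} for s in candidates])
best = candidates[model.predict(frame).argmin()]
print(f"Recommended chilled-water setpoint: {best:.1f} degC")
```

In a real deployment a recommendation like this would feed a control loop with hard safety limits rather than being applied blindly.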
 Power and energy storage
Power distribution and backup power systems (UPSs) contain limited AI today. They can use firmware to make basic decisions based on a sensor's input and pre-programmed, desired outputs, but they are not built to learn from changing inputs and outputs the way ML-integrated cooling now is. For UPSs, integrating ML also has a different end goal than it does for cooling: the focus is on preventing downtime by predicting failures and enabling preventive maintenance, either self-performed or by alerting engineers to a specific problem. Some minor efficiency gains from ML-based automation are also likely. Where AI and ML are not yet integrated is backup energy storage. That story, a much more interesting one, begins with lithium-ion batteries.
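Before turning to energy storage, here is a minimal sketch of the failure-prediction idea described above: flag UPS battery strings whose telemetry looks unusual so an engineer can inspect them before they cause downtime. The telemetry file, its columns (including a unit identifier), and the assumed anomaly rate are illustrative only; a production system would use far richer features and history.

```python
# A minimal sketch of predictive maintenance for UPS battery strings.
# File name, columns, and the assumed 2% anomaly rate are illustrative only.
import pandas as pd
from sklearn.ensemble import IsolationForest

telemetry = pd.read_csv("ups_battery_telemetry.csv")  # hypothetical fleet export
features = telemetry[["cell_voltage", "internal_resistance_mohm",
                      "temperature_c", "charge_cycles"]]

# Fit an unsupervised outlier detector on the fleet's recent readings.
detector = IsolationForest(contamination=0.02, random_state=0).fit(features)
telemetry["anomalous"] = detector.predict(features) == -1  # -1 marks outliers

for unit in telemetry.loc[telemetry["anomalous"], "unit_id"].unique():
    print(f"UPS string {unit}: telemetry looks abnormal, schedule an inspection")
```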
Lithium-ion batteries continue to see growing adoption in data centers. As their prices decrease and concerns over safety ease, the argument for staying with VRLA (valve-regulated lead-acid) batteries becomes harder to make. Growing lithium-ion adoption also opens new possibilities, because the chemistry allows more frequent charge cycles, healthy operation below full charge, no coup de fouet effect, and no need for a tightly controlled battery-room environment.
The combination of lithium-ion battery benefits and ML capabilities has led CUI Inc and VPS (Virtual Power Systems) to integrate the two into a new energy storage solution. "Whether a data center is at the edge, a colocation, or large hyperscale, they all have power infrastructure optimization issues," explains Mark Adams, Senior Vice President of CUI Inc. "With Software Defined Power, CUI Inc and Virtual Power Systems provides a dynamically intelligent solution to capture currently overprovisioned infrastructures, and gain greater capitalization of the footprint. Machine learning algorithms allow for the solution to analyze, predict, and react to the changes through a combination of hardware energy storage and software."
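One way to read "capture currently overprovisioned infrastructures" is peak shaving: if near-term demand can be predicted, stored energy can cover the peaks and the upstream feed can be sized closer to average load than to worst case. The sketch below illustrates that decision logic only; the simple trend-extrapolation forecaster and every number in it are assumptions, not a description of the CUI/VPS product.

```python
# A rough sketch of prediction-driven peak shaving. The forecaster and all
# capacities/readings are illustrative assumptions, not a vendor implementation.
import numpy as np

PROVISIONED_KW = 800.0  # hypothetical capacity of the upstream feed
BATTERY_KW = 150.0      # hypothetical maximum discharge rate of the Li-ion bank

def forecast_next_interval(recent_load_kw: np.ndarray) -> float:
    """Stand-in forecaster: linear trend extrapolation over the recent readings."""
    slope = np.polyfit(np.arange(len(recent_load_kw)), recent_load_kw, 1)[0]
    return float(recent_load_kw[-1] + slope)

recent = np.array([720.0, 740.0, 765.0, 790.0, 810.0])  # kW, hypothetical readings
predicted = forecast_next_interval(recent)

if predicted > PROVISIONED_KW:
    discharge = min(predicted - PROVISIONED_KW, BATTERY_KW)
    print(f"Forecast {predicted:.0f} kW exceeds the feed; discharge storage at {discharge:.0f} kW")
else:
    print(f"Forecast {predicted:.0f} kW fits the provisioned feed; storage can recharge")
```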
Distributed IT architectures: edge and hybrid data center deployments
The next generation of data center deployments is more distributed, often combining cloud and colocation services, on-premises compute, and edge data center deployments. This will commonly require several "lights-out" data centers that must be managed and operated remotely, and ML will play a critical role in their automation and management. Specifically, ML will enable these "lights-out" sites to run efficiently, with predictive-failure and preventive-maintenance alerts to reduce downtime, while also reducing the resources required to manage such a footprint.
 Conclusion
Artificial intelligence and machine learning are touching many aspects of the data center. They are bringing efficiency gains, increased reliability, and automation to physical infrastructure, and they will allow significantly improved remote management of distributed data center footprints. However, while AI and ML will revolutionize how data centers are operated, and could even enable fully autonomous operation, a human presence will remain critical to data center operations over the next decade.
The human ability to hear, smell, and simply sense when something is wrong has yet to be matched by current AI and ML capabilities. So, while it may be time to start bringing ML-based infrastructure into the data center, it's not time to start shipping the humans out.
 For more information, please contact:
Lucas Beran
Senior Research Analyst, Cloud and Data Centers
+1 512 813 6290