#Campus Map Ecu
Explore tagged Tumblr posts
gilang-prakasa · 5 years ago
Photo
Tumblr media
Campus Map ECU parking map (GIF). Source: maps.ecok.edu
0 notes
sensorymapcentral · 3 years ago
Text
Alternative Atlas Project
For my map, I wanted to highlight my disappointment and annoyance with the suburb I live in. Granted, mine is better than most I’ve seen, but still. There’s very little area to actually walk if you want to go anywhere: most places are a 15 to 30 minute walk away, and for most of the journey you’re next to cars, smelling their fumes. This matters especially for people with disabilities; because the suburb offers so little walkable mobility and basically requires a car, it is hostile to them. (Davis)
The sound map that Davis and Thomas created was one of the more fun maps made this semester. The focus was to capture the sound of an area by walking through the space. We wanted to highlight the differences between a lively city soundscape and that of a quieter suburb. The city sounds were full of screeching machinery and people rushing around everywhere, and the combination made it difficult to discern specific sounds. The sounds of the suburb, on the other hand, were much softer, and each loud sound could be heard clearly. These maps were created by experiencing the land first hand: by spending time in a space, we can understand the story of an area beyond just our base knowledge of it. Although it is known that there is construction going on in the Oakridge area, walking through that area reveals all the sights, sounds, and smells. The sound map captured only one aspect that can be sensed in those areas, but it demonstrated that it takes time to truly experience a place. (Thomas)
While walking one day I noticed that there are so many different versions of soil, and that something as ‘natural’ as a mixture of organic remains raises an important question: how influential and impactful are we towards our environment? Human interaction changes our environment; we have heard this time and time again. So I decided to investigate by mapping dirt samples, a typically ignored part of Earth. I explored within a mile of Emily Carr in various directions and locations, and discovered that where people live (neighborhoods, houses, etc.) the ground contained less actual dirt and more of a thick combination of materials, whereas parks and preserved landscapes provided rawer, more compact dirt. Perhaps we can take from these observations that the world cannot process our material nearly as fast as it processes its own. The same can be said of garbage and compost: we combine these systems in ways that destroy and disrupt, when in reality the natural cycle functions melodically. The dirt map is a small study of our impact on the Earth, but it stands to show that where there are more humans, there will be more disruption, even within our soil. (Damarra)
This map is composed of clay tiles imprinted with textures found across the ECU campus. Each tile corresponds to a different area of the campus and a texture found in that location. The textures were transferred onto the tiles by laying each tile across a surface and rolling a rubber brayer over top of it to pick up the texture. Each tile was then brought to the physical location it represents. The clay used for this project was reclaimed, processed, and made in the ECU ceramics studio. Our group chose the Greater Vancouver area as the location of our atlas, as we all live in Vancouver and the surrounding areas; the ECU campus is the place that connects us, since we all attend school here. This map speaks to the theme of our atlas by exploring a sensory-based approach to map making and understanding places beyond a visual representation. Using the realm of touch, it invites the viewer to notice details of a familiar space in a new way. (Gus)
0 notes
socs-n-sandals · 5 years ago
Text
Facilities Interview
On October 3, Naomi, Chayann, and I met with Danielle from facilities to talk about our project and some questions that arose from our walk-around mapping. 
Danielle was extremely helpful and knowledgeable about how the spaces in the school are booked and looked after. An important detail that came up was the distinction between curriculum and ad hoc space scheduling. We were not aware that there was a difference, or that some places are reserved solely for curriculum-based work and exhibits.
We also learned that there is currently no centralized booking system for the entire school and that different stakeholders are involved in different parts of the school; for instance, the entire first and second floors are reserved for exhibitions through the Libby Leshgold Gallery.
A difficult question to answer was where the boundaries of the ECU campus lie. Much of the outside area is controlled by the GNW Trust and Low Tide Properties, excluding (ironically) the Wilson Plaza and the grass circle there. There is nothing that marks the border of our campus.
We asked about the time limits for bookings, and they depend on the purpose and location. For critiques, one can reserve a space for the day; for an independent exhibition there is usually a three-day maximum, though some exhibitions can be held longer. We asked about more permanent installations, and Danielle told us that these kinds of bookings actually go through the Dean.
The cafeteria layout turns out to be a joint effort between the student union and the Loafe cafe; it is collaborative, but no single party has sole responsibility for the baseline layout.
We all spoke about the communication between facilities and students, and agreed that there is some difficulty in understanding whom to go to, what's available, and for what purpose. Danielle said a centralized booking system is in the works, and that our project may be helpful in its execution.
Danielle is an extremely valuable resource and expert for our project, and she was happy to keep helping us as the project moves forward. We really enjoyed talking to her and hope to speak with her again.
0 notes
eddiejpoplar · 7 years ago
Text
BMW shares with us its autonomous technology roadmap
Earlier this week, BMW invited selected media to the heart of Silicon Valley for a briefing on its roadmap to self-driving cars, which are scheduled to arrive in 2021. The exclusive event took place at the BMW Technology Office in Mountain View, California, a research site that works in close collaboration with the innovation group in Munich.
Many of the app integrations currently found in new BMWs started in Mountain View, from the Amazon Alexa and Echo integration to the development of the Apple Watch app. Now the office focuses mostly on autonomous driving, drawing talent from the competitive landscape of the Bay Area. BMW says one incentive it offers prospective employees is the ability to work on products with a clearly defined roadmap.
Dr. Klaus Buettner, Head of Autonomous Driving at BMW, and Simon Euringer, Head of the BMW Group Technology Office USA, hosted the event, giving us a deeper look into BMW's autonomous driving philosophy, one that is significantly different from that of other automakers in the space. The executives emphasized BMW's more cautious approach, which allows the company to deliver a fully functional, no-compromise product when it goes to market.
According to Dr. Buettner, the start of production for BMW's first autonomous driving car is set in stone for 2021 with the BMW iNEXT. To get there, BMW outlined its vision of the different levels of autonomous driving, listed below (and sketched as a simple data structure after the list):
Level 0 – Hands on, eyes on
Level 1 – Hands on, eyes on, longitudinal or lateral guidance
Level 2 – Hands temporarily off, eyes temporarily off, traffic control and longitudinal or lateral guidance
Level 3 – Hands off, eyes off, driver must remain aware for a take-over request
Level 4 – Hands off, mind off, no driver intervention, no take-over request
Level 5 – Hands off, driver off, no driver
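To make the progression easier to scan, here is a minimal sketch (Python; the structure and wording are mine, not BMW's or SAE's official terminology) that encodes the level definitions above:

```python
from dataclasses import dataclass

# Hypothetical encoding of the levels as described above; field names
# and notes are illustrative, not official BMW terminology.
@dataclass(frozen=True)
class AutonomyLevel:
    level: int
    hands: str
    eyes: str
    notes: str

LEVELS = [
    AutonomyLevel(0, "on", "on", "no automated guidance"),
    AutonomyLevel(1, "on", "on", "longitudinal or lateral guidance"),
    AutonomyLevel(2, "temporarily off", "temporarily off",
                  "traffic control and longitudinal or lateral guidance"),
    AutonomyLevel(3, "off", "off", "driver must remain aware for a take-over request"),
    AutonomyLevel(4, "off", "off (mind off)", "no driver intervention, no take-over request"),
    AutonomyLevel(5, "off", "off (no driver)", "no driver at all"),
]

for lvl in LEVELS:
    print(f"Level {lvl.level}: hands {lvl.hands}, eyes {lvl.eyes} - {lvl.notes}")
```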
The iNEXT will employ Level 3 technology, but it won't be standard on the car. “There is a contradiction in brand values between manual driving and autonomous driving,” Dr. Buettner says. “How to deal with both sides?”
BMW recognizes the need to offer a car that retains the typical BMW driving dynamics, so the self-driving technology becomes an add-on, for an additional cost.
Another development project will run in parallel with the iNEXT to test and implement the Level 4 and 5 vision, especially in the challenging urban environment where the need to take over the wheel is far higher.
The first car offering Level 3 autonomous operation will be the Audi A8, arriving in early 2018, but the self-driving technology from Ingolstadt comes with a caveat: it works only up to 37 mph. Dr. Buettner says that BMW wanted to deliver a far more compelling product, hence the more cautious approach, which will take Level 3 operation to a top speed of 80 mph, better suited to highway driving than to typical stop-and-go traffic.
“With Level 3 autonomy, there is a major impact for the whole architecture of the car,” Dr. Buettner says, also citing a technological quantum leap from Level 2 to Level 3. BMW adds that computing power has reached a level that allows it to start work on the functional side. In the next few years, sensors should also be industrialized, as there is a huge push to make more advanced sensors available for self-driving features.
A fully autonomous car needs an impressive array of sensors and tech: 3 LIDARs, a full-range radar, 2 short-range radars, rear-facing cameras, a GPS antenna, a trifocal camera, a front side camera, a 360-degree surround-view camera, 360-degree ultrasonic coverage, 1 rear camera, and 2 rear LIDARs.
The new G30 5 Series, which is on the market today, has Level 2 technology with the following features: a side-view camera, full-range radar, ultrasonic sensors, side-range radar, a surround-view camera, a stereo front camera, and a rear-view camera.
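Written out as plain data, the two sensor sets above compare roughly as follows; the labels and groupings are paraphrased from the article, not an official BMW parts list:

```python
# Sensor suites as quoted at the event, paraphrased into plain labels.
# Counts and groupings are approximate, not an official specification.
FULL_AUTONOMY_SUITE = [
    "3x LIDAR", "full-range radar", "2x short-range radar",
    "rear-facing cameras", "GPS antenna", "trifocal camera",
    "front side camera", "360-degree surround-view camera",
    "360-degree ultrasonic coverage", "rear camera", "2x rear LIDAR",
]

G30_LEVEL_2_SUITE = [
    "side-view camera", "full-range radar", "ultrasonic sensors",
    "side-range radar", "surround-view camera", "stereo front camera",
    "rear-view camera",
]

print(f"Full-autonomy suite: {len(FULL_AUTONOMY_SUITE)} sensor groups")
print(f"G30 Level 2 suite:   {len(G30_LEVEL_2_SUITE)} sensor groups")
```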
Level 3 also presents issues on the highway at high speeds, where the detection of small objects is critical. Equally if not more challenging are urban environments, where pedestrian behavior is an issue.
BMW's platform for autonomous driving is extremely complex; the company cites some of its pillars: high-performance computing, artificial intelligence, computer vision, validation methods, raw sensor data processing, and system safety and robustness. A deep and extensive collaboration with other automakers and tech companies is therefore needed to achieve the end goal. For example, HD maps are an essential component of autonomous driving, so BMW, Daimler, and Audi partnered to buy HERE Maps. US HD map coverage of 255,000 km is expected by the end of 2017.
High-speed data transfer is also important for the communication protocols, so joining the 5G Automotive Association was seen as a natural step in Munich.
Different tiers of partnership are also in place. BMW talked about the importance of its partnership with Intel and Mobileye, which provide in-car computers powerful enough to drive its future cars, as well as data centers to process the data; Intel has already set one up in Munich. Other OEM/Tier 1 partners include Magna, FCA, Delphi, and Continental, which are involved in development and integration. BMW is currently in talks to add more partners in 2017, before the development requirements are frozen in 2018.
Speaking of development and engineering, BMW is all about being nimble and iterative, so the company has adopted an Agile software development methodology. To support this, the BMW Group is combining its development expertise in vehicle connectivity and automated driving at a new campus in Unterschleissheim, near Munich. Upon final completion, more than 2,000 employees will work there on the next steps towards fully automated driving, from software development to road testing. Currently 700 to 800 employees are housed in the new building, which encourages collaboration and an open-space work environment similar to what you would see in Silicon Valley startups.
In designing the iNEXT and other future self-driving cars, BMW uses a modular platform that allows, for example, a smaller ECU in some cars and a larger one where needed. The number of microprocessors in a car ranges from 2 to 7 units, with power consumption rated from 25 W to 600 W, which in some cases might require liquid cooling.
BMW also emphasized safety, likely the most important and most challenging requirement for the adoption and legalization of autonomous cars. Dr. Buettner says that 240 million kilometers have to be completed without accidents: 5 percent of this by fleet cars and the rest through computer simulation.
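As a quick back-of-the-envelope check of those figures (the split below is my arithmetic, not a breakdown BMW gave):

```python
# Split of the quoted 240 million accident-free validation kilometres.
TOTAL_KM = 240_000_000   # total validation distance cited by Dr. Buettner
FLEET_SHARE = 0.05       # portion to be driven by real test fleet cars

fleet_km = TOTAL_KM * FLEET_SHARE      # 12,000,000 km on real roads
simulated_km = TOTAL_KM - fleet_km     # 228,000,000 km in simulation

print(f"Real-world fleet: {fleet_km:,.0f} km")
print(f"Simulation:       {simulated_km:,.0f} km")
```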
BMW currently has 40 7 Series cars equipped with its self-driving technology testing around Munich, but it eventually aims for 185 test cars on the road. In 2019, there will be 100 vehicles with HAD (Highly Automated Driving) and FAD (Fully Automated Driving) tested worldwide.
Two 7 Series prototypes are currently being tested in the Bay Area, one with Level 3 and the other with tech needed for Levels 4 and 5. Footage of those prototypes can be seen below.
youtube
youtube
Automakers like Mercedes, Volvo, Ford, and General Motors are working with ride-hailing companies to incorporate their self-driving technology into those fleets, but BMW says it will announce its partnerships next year. One viable option is to test Level 3 systems in its own DriveNow and ReachNow fleets.
The conclusion of the event was that even though BMW has the capability to deliver Level 3-enhanced cars today, the company prefers to take a more cautious approach in order to achieve the perfect product BMW customers expect, which also means no beta testing on its drivers.
The article BMW shares with us its autonomous technology roadmap appeared first on BMW BLOG
0 notes