#perspectivecamera
pedromartins15 · 5 years ago
Photo
Like I give a shit about what you think... street style rules... In Frame: @natalieegates •••••••••••••••••••••••••••••••••• #pedro_pixel_ @pedro_pixel_ #natularlightphotographer #streetraitz #streetstyle #greatmodels #alternative #alternativemodel #tattoos #piercings #streetclothes #coolhats #niceglasses #colourvsblackandwhite #portraitphotography #portraitvision #canonphotography #canonlens #50mm14 #capitalports #perspectivecamera #portraits4all #humanspt #shooters_pt (em Shoreditch District - London) https://www.instagram.com/p/CAKoRZFhJnp/?igshid=eplyqyye8bar
ochiaimikan · 6 years ago
Link
About three.js
carolinesprojectincc · 4 years ago
Text
TIFUitDTB Part 1: Importing classes
This post is part of a new series I like to call “Things I find useful in the Discover Three.js book”.
“However, in this book we’ll prefer to import only the classes that we need in any given module:
main.js: importing classes as we need them
import {
  PerspectiveCamera,
  MeshStandardMaterial,
  WebGLRenderer,
} from './vendor/three/build/three.module.js';
Now instead of hundreds of properties being imported, there are only the three that we need[...]
Doing this forces us to think more carefully about the classes we’re using in a given module, which means we’re more likely to follow best practices and keep our modules small and focused. We can also avoid using the THREE namespace this way. [...]
[About using CDN]
To find a file from the repo, take the URL from GitHub (such as examples/jsm/controls/OrbitControls.js) and prepend https://unpkg.com/three@0.{version}.0, where {version} is the current release of three.js.”
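Putting the two quoted ideas together, here is a minimal sketch of my own (not from the book); the release number 0.132.0 is only a placeholder, so substitute whichever release you are actually using:
// Named imports straight from the CDN, following the prepend rule quoted above.
import { PerspectiveCamera, WebGLRenderer } from 'https://unpkg.com/three@0.132.0/build/three.module.js';
import { OrbitControls } from 'https://unpkg.com/three@0.132.0/examples/jsm/controls/OrbitControls.js';
// The imported classes are used directly, with no THREE namespace in sight.
const camera = new PerspectiveCamera(35, window.innerWidth / window.innerHeight, 0.1, 100);
const renderer = new WebGLRenderer();
const controls = new OrbitControls(camera, renderer.domElement);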
reactsharing-blog · 7 years ago
Text
Introducing Expo AR: Three.js on ARKit
The newest release of Expo features an experimental release of the Expo Augmented Reality API for iOS. This enables the creation of AR scenes using just JavaScript with familiar libraries such as three.js, along with React Native for user interfaces and Expo’s native APIs for geolocation and other device features. Check out the below video to see it in action!
[Embedded YouTube video: Expo AR demo]
Play the demo on Expo here! You’ll need an ARKit-compatible device to run it. The demo uses three.js for graphics, cannon.js for real-time physics, and Expo’s Audio API to play the music. You can summon the ducks toward you by touching the screen. The sound changes when you go underwater. You can find the source code for the demo on GitHub.
We’ll walk through the creation of a basic AR app from scratch in this blog post. You can find the resulting app on Snack here, where you can edit the code in the browser and see changes immediately using the Expo app on your phone! We’ll keep this app really simple so you can see how easy it is to get AR working, but all of the awesome features of three.js will be available to you to expand your app later. For further control you can use Expo’s OpenGL API directly to perform your own custom rendering.
Making a basic three.js app
First let’s make a three.js app that doesn’t use AR. Create a new Expo project with the “Blank” template (check out the Up and Running guide in the Expo documentation if you haven’t done this before; it suggests the “Tab Navigation” template, but we’ll go with the “Blank” one). Make sure you can open it with Expo on your phone; it should show the Blank template’s default screen.
Update to the latest version of the expo library:
npm update expo
Make sure its version is at least 21.0.2.
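If you are unsure which version ended up installed, you can check it from the project directory with a standard npm command:
npm ls expo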
Now let’s add the expo-three and three.js libraries to our project. Run the following command to install them from npm:
npm i -S three expo-three
Import them at the top of App.js as follows:
import * as THREE from 'three';
import ExpoTHREE from 'expo-three';
Now let’s add a full-screen Expo.GLView in which we will render with three.js. First, import Expo:
import Expo from 'expo';
Then replace render() in the App component with:
render() {
  return (
    <Expo.GLView style={{ flex: 1 }} />  // flex: 1 makes the GLView fill the screen
  );
}
This should make the app turn into just a white screen. That’s what an Expo.GLView shows by default. Let’s get some three.js in there! First, let’s add an onContextCreate callback for the Expo.GLView; this is where we receive a gl object to do graphics stuff with:
render() {
  return (
    <Expo.GLView
      style={{ flex: 1 }}
      onContextCreate={this._onGLContextCreate}
    />
  );
}
_onGLContextCreate = async (gl) => {
  // Do graphics stuff here!
};
We’ll mirror the introductory three.js tutorial, except using modern JavaScript and using expo-three’s utility function to create an Expo-compatible three.js renderer. That tutorial explains the meaning of each three.js concept we’ll use pretty well. So you can use the code from here and read the text there! All of the following code will go into the _onGLContextCreate function.
First we’ll add a scene, a camera and a renderer:
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  75, gl.drawingBufferWidth / gl.drawingBufferHeight, 0.1, 1000
);
const renderer = ExpoTHREE.createRenderer( gl );
renderer.setSize(gl.drawingBufferWidth, gl.drawingBufferHeight);
This code reflects that in the three.js tutorial, except we don’t use window (a web concept) to get the size of the renderer, and we also don’t need to attach the renderer to any document (again a web concept) — the Expo.GLView is already attached!
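For comparison, the browser version in the three.js introductory tutorial does roughly the following (shown only to highlight the difference; it has no place in the Expo app):
// Web-only setup from the three.js docs, for contrast with the Expo code above.
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight); // window is a web concept
document.body.appendChild(renderer.domElement);          // and so is document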
The next step is the same as from the three.js tutorial:
const geometry = new THREE.BoxGeometry(1, 1, 1);
const material = new THREE.MeshBasicMaterial({ color: 0x00ff00 });
const cube = new THREE.Mesh(geometry, material);
scene.add(cube);

camera.position.z = 5;
Now let’s write the main loop. This is again the same as from the three.js tutorial, except Expo.GLView’s gl object requires you to signal the end of a frame explicitly:
const animate = () => {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
  gl.endFrameEXP();
};
animate();
Now if you refresh your app you should see a green square, which is just what a green cube would look like from one side:
Oh and also you may have noticed the warnings. We don’t need those extensions for this app (and for most apps), but they still pop up, so let’s disable the yellow warning boxes for now. Put this after your import statements at the top of the file:
console.disableYellowBox = true;
Let’s get that cube dancing! In the animate function we wrote earlier, before the renderer.render(…) line, add the following:
cube.rotation.x += 0.07;
cube.rotation.y += 0.04;
This just updates the rotation of the cube every frame, resulting in a rotating cube:
You have a basic three.js app ready now… So let’s put it in AR!
Adding AR
First, we create an AR session for the lifetime of the app. Expo needs to access your device’s camera and other hardware facilities to provide AR tracking information and the live video feed, and it tracks the state of this access using the AR session. An AR session is created with the Expo.GLView’s startARSessionAsync() method. Let’s do this in _onGLContextCreate before all our three.js code. Since we need to call this on our Expo.GLView, we save a ref to it and make the call:
render() {
  return (
    <Expo.GLView
      ref={(ref) => this._glView = ref}
      style={{ flex: 1 }}
      onContextCreate={this._onGLContextCreate}
    />
  );
}
_onGLContextCreate = async (gl) => {
  const arSession = await this._glView.startARSessionAsync();
...
Now we are ready to show the live background behind the 3d scene! ExpoTHREE.createARBackgroundTexture(arSession, renderer) is an expo-three utility that returns a THREE.Texture that live-updates to display the current AR background frame. We can just set the scene’s .background to it and it’ll display behind our scene. Since this needs the renderer to be passed in, we added it after the renderer creation line:
scene.background = ExpoTHREE.createARBackgroundTexture(arSession, renderer);
This should give you the live camera feed behind the cube:
You may notice though that this doesn’t quite get us to AR yet: we have the live background behind the cube, but the cube still isn’t being positioned to reflect the device’s position in real life. We have one little step left: using an expo-three AR camera instead of three.js’s default PerspectiveCamera. We’ll use ExpoTHREE.createARCamera(arSession, width, height, near, far) to create the camera instead of what we already have. Replace the current camera initialization with:
const camera = ExpoTHREE.createARCamera( arSession, gl.drawingBufferWidth, gl.drawingBufferHeight, 0.01, 1000 );
Currently we position the cube at (0, 0, 0) and move the camera back to see it. Instead, we’ll keep the camera position unaffected (since it is now updated in real time to reflect AR tracking) and move the cube forward a bit. Remove the camera.position.z = 5; line and add cube.position.z = -0.4; after the cube creation (but before the main loop). Also, let’s scale the cube down to a side length of 0.07 to match the AR coordinate space’s units, which are meters. In the end, your code from scene creation to cube creation should now look as follows:
const scene = new THREE.Scene();
const camera = ExpoTHREE.createARCamera(
  arSession, gl.drawingBufferWidth, gl.drawingBufferHeight, 0.01, 1000
);
const renderer = ExpoTHREE.createRenderer( gl );
renderer.setSize(gl.drawingBufferWidth, gl.drawingBufferHeight);

scene.background = ExpoTHREE.createARBackgroundTexture(arSession, renderer);

const geometry = new THREE.BoxGeometry(0.07, 0.07, 0.07);
const material = new THREE.MeshBasicMaterial({ color: 0x00ff00 });
const cube = new THREE.Mesh(geometry, material);
cube.position.z = -0.4;
scene.add(cube);
This should give you a cube suspended in AR:
And there you go: you now have a basic Expo AR app ready!
Links
To recap, here are some links to useful resources from this tutorial:
The basic AR app on Snack, where you can edit it in the browser
The source code for the “Dire Dire Ducks!” demo on GitHub
The Expo page for the “Dire Dire Ducks!” demo, which lets you play it on Expo immediately
Have fun and hope you make cool things! 🙂
Had to include this picture to get a cool thumbnail for the blog post…
Introducing Expo AR: Three.js on ARKit was originally published in Exposition on Medium, where people are continuing the conversation by highlighting and responding to this story.
via React Native on Medium http://ift.tt/2xmNejL
Via https://reactsharing.com/introducing-expo-ar-three-js-on-arkit-4.html
askanything-online-blog · 8 years ago
Text
Limit camera rotation on the Y axis
I am using JSModeler to display OBJ files. It internally uses THREE.JS and creates a PerspectiveCamera. What I need is to limit the movement of the camera on the Y axis so that it does not go underneath the object. I know how to do this with THREE.OrbitControls, but that doesn’t work with JSModeler. Is there a way to directly control the camera movement? Thanks.
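For reference, the OrbitControls approach the asker mentions clamps the controls’ polar angle, and when the camera has to be driven directly, one common fallback is to clamp it yourself before each render. A rough sketch (not JSModeler-specific; where to hook the clamp into JSModeler’s mouse handling depends on its API):
// With OrbitControls: forbid orbiting below the horizontal plane.
const controls = new THREE.OrbitControls(camera, renderer.domElement);
controls.maxPolarAngle = Math.PI / 2; // never look up at the object from underneath
// Without OrbitControls: clamp the camera manually before each render.
function clampCamera(camera, minY) {
  if (camera.position.y < minY) {
    camera.position.y = minY;                // keep the camera above the floor plane
  }
  camera.lookAt(new THREE.Vector3(0, 0, 0)); // keep aiming at the model's origin
}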
Source: stackoverflow-javascript
mbaljeetsingh · 8 years ago
Text
Create 3D Web Apps & Graphics with Whitestorm.js
Web games have come a long way thanks to WebGL and related HTML5 APIs. The most prominent open source library for 3D is Three.js.
While Three.js is powerful, it’s also complex to learn from scratch. Instead, you can pick up Whitestorm.js, an open source framework made for 3D web graphics. It uses Three.js as an underlying technology to help you build faster and create realistic 3D effects in the browser.
Whitestorm comes with its own physics engine built on top of Three.js rendering. This lets you create realistic gravity and other similar effects where objects interact & respond to each other.
And Whitestorm is completely modular, so you have full control over which features get loaded into the page. It uses the Bullet Physics library ported into JavaScript for full support on the web.
Here’s a basic snippet of code that creates a new Three.js environment using Whitestorm.
const app = new WHS.App([
  new WHS.app.ElementModule(),   // attach to DOM
  new WHS.app.SceneModule(),     // creates THREE.Scene instance
  new WHS.app.CameraModule(),    // creates PerspectiveCamera instance
  new WHS.app.RenderingModule()  // creates WebGLRenderer instance
]);

app.start(); // run animation
You can naturally add your own modules and even create plugins/components built off the default library. The JS code supports ECMAScript 6 and should support all upcoming changes to the language.
Geometry, physics, and motion all rolled into one library. Whitestorm really is the future of 3D animation for the web.
You can find lots of sample code in the GitHub repo along with download links and a file browser. Be warned: the library is huge, so there’s a lot to go through. Even the documentation has lengthy tutorials for beginners.
But with those docs, you can learn everything from 3D transforms to debugging and detailed 3D animation.
To learn more, visit the main site and browse through some live examples to see Whitestorm in action. If you’re daring enough to dive in, then download a copy of the library from GitHub or via npm and start creating some sweet 3D web apps.
Recommended Reading: 20 Useful 3D-Modeling Software You Can Use For Free
via Hongkiat http://ift.tt/2qafnbl
pedromartins15 · 5 years ago
Photo
Station Tunnel in London city Stay with amazing: @karennijsen •••••••••••••••••••••••••••••••••• #pedro_pixel_ @pedro_pixel_ #naturallightphotographer #naturallightphotography #station #tunnel #kingscrossstation #blackandwhitephotography #blackandwhitephoto #greatmodel #asianbeauty #mixedrace #with #doutch #longhair #longstraighthair #asianmodel #portraitphotography #portraitvision #london #capitalports #portraits4all #perspectivecamera #canon70d #canonphotography #50mm #59mm14 (em King's Cross) https://www.instagram.com/p/CApN1EOhYXG/?igshid=135ukqw7v8qd6