expo-gl exports a `GLView` class, which is a `View` that acts as an OpenGL ES render target. On mounting, an OpenGL ES context is created. Its drawing buffer is presented as the contents of the `View` every frame.
First, install the package from the npm registry:

```
npm install expo-gl
```

or

```
yarn add expo-gl
```

If you're using CocoaPods, add the dependency to your `Podfile`:

```ruby
pod 'EXGL', path: '../node_modules/expo-gl/ios'
pod 'EXGL-CPP', path: '../node_modules/expo-gl-cpp/cpp'
```

and run `pod install`.
Alternatively, you can link the library manually in Xcode:

1. In Xcode, in the project navigator, right click `Libraries` ➜ `Add Files to [your project's name]`.
2. Go to `node_modules` ➜ `expo-gl` ➜ `ios` and add `EXGL.xcodeproj`, then go to `node_modules` ➜ `expo-gl-cpp` ➜ `ios` and add `EXGL-CPP.xcodeproj`.
3. Add `libEXGL.a` to your project's `Build Phases` ➜ `Link Binary With Libraries`.
4. Run your project (`Cmd+R`).

On Android, append the following lines to `android/settings.gradle`:

```
include ':expo-gl'
project(':expo-gl').projectDir = new File(rootProject.projectDir, '../node_modules/expo-gl/android')
include ':expo-gl-cpp'
project(':expo-gl-cpp').projectDir = new File(rootProject.projectDir, '../node_modules/expo-gl-cpp/android')
```

and, if not already included:

```
include ':expo-core'
project(':expo-core').projectDir = new File(rootProject.projectDir, '../node_modules/expo-core/android')
include ':expo-camera-interface'
project(':expo-camera-interface').projectDir = new File(rootProject.projectDir, '../node_modules/expo-camera-interface/android')
include ':expo-file-system-interface'
project(':expo-file-system-interface').projectDir = new File(rootProject.projectDir, '../node_modules/expo-file-system-interface/android')
```

Then add the following lines to the dependencies section of `android/app/build.gradle`:

```
compile project(':expo-gl')
compile project(':expo-gl-cpp')
```

and, if not already included:

```
compile project(':expo-core')
compile project(':expo-camera-interface')
compile project(':expo-file-system-interface')
```
Once installed, you can use the component like this:

```js
import React from 'react';
import { GLView } from 'expo-gl';

export default class ExpoIsAwesome extends React.Component {
  onContextCreate = gl => {
    // WebGL operations
  };

  render() {
    return (
      <GLView onContextCreate={this.onContextCreate} />
    );
  }
}
```
Other than the regular `View` props for layout and touch handling, the following props are available:
`onContextCreate`
A function that will be called when the OpenGL ES context is created. The function is passed a single argument `gl` that has a WebGLRenderingContext interface.
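For example, a minimal `onContextCreate` callback could simply clear the drawing buffer and present the frame. This is a sketch; the clear color is arbitrary:

```js
onContextCreate = gl => {
  // Render to the full drawing buffer
  gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);

  // Clear the screen to a solid color
  gl.clearColor(0.0, 0.5, 1.0, 1.0);
  gl.clear(gl.COLOR_BUFFER_BIT);

  // Notify expo-gl that the frame is ready to be presented
  gl.endFrameEXP();
};
```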
`msaaSamples`
`GLView` can enable iOS's built-in multisampling. This prop specifies the number of samples to use; by default this is 4. Setting this to 0 turns off multisampling. On Android this prop is ignored.
`takeSnapshotAsync(options)`
Same as `GLView.takeSnapshotAsync`, but uses the WebGL context that is associated with the view on which the method is called.
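A sketch of how the instance method could be used, assuming a ref to the mounted `GLView` and that something has already been drawn to it:

```js
import React from 'react';
import { GLView } from 'expo-gl';

export default class SnapshotExample extends React.Component {
  onContextCreate = gl => {
    gl.clearColor(0, 0, 0, 1);
    gl.clear(gl.COLOR_BUFFER_BIT);
    gl.endFrameEXP();
  };

  // Hypothetical handler: call this once a frame has been rendered
  saveFrame = async () => {
    const snapshot = await this.glView.takeSnapshotAsync({ format: 'png' });
    console.log(snapshot.uri, snapshot.width, snapshot.height);
  };

  render() {
    return (
      <GLView
        ref={view => (this.glView = view)}
        onContextCreate={this.onContextCreate}
      />
    );
  }
}
```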
`GLView.createContextAsync()`
Imperative API that creates a headless context, i.e. one without an underlying view. It's useful for headless rendering, or when you want to keep just one context per application and share it between multiple components. It is slightly faster than a regular context because it doesn't swap framebuffers or present them on the canvas; however, it may require you to take a snapshot in order to present its results. Also keep in mind that you need to set up a viewport and create your own framebuffer and texture to draw into before you take a snapshot.
Returns a promise that resolves to a WebGL context object. See the WebGL API section below for more details.
`GLView.destroyContextAsync(gl)`
Destroys the given context.
Returns a promise that resolves to a boolean value that is `true` if the given context existed and has been destroyed successfully.
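A minimal sketch of the headless flow described above. The render-target size, the draw calls, and the snapshot options are illustrative only:

```js
import { GLView } from 'expo-gl';

async function renderHeadless() {
  // Create a context that is not attached to any view
  const gl = await GLView.createContextAsync();

  // A headless context needs its own render target
  const width = 512;
  const height = 512;
  gl.viewport(0, 0, width, height);

  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);

  const framebuffer = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture, 0);

  // ... issue your draw calls here ...
  gl.clearColor(1, 0, 0, 1);
  gl.clear(gl.COLOR_BUFFER_BIT);

  // Save the result to a file, then clean up
  const snapshot = await GLView.takeSnapshotAsync(gl, {
    rect: { x: 0, y: 0, width, height },
    format: 'png',
  });
  await GLView.destroyContextAsync(gl);
  return snapshot.uri;
}
```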
`GLView.takeSnapshotAsync(gl, options)`
Takes a snapshot of the framebuffer and saves it as a file in the app's cache directory. The `options` object accepts:

- `rect` (`{ x: number, y: number, width: number, height: number }`) -- Rect to crop the snapshot. It's passed directly to `glReadPixels`.
- `flip` (boolean) -- Whether to flip the snapshot vertically. Defaults to `false`.
- `format` (`'jpeg'` or `'png'`) -- Specifies what type of compression should be used and what the resulting file extension is. PNG compression is lossless but slower, JPEG is faster but the image has visible artifacts. Defaults to `'jpeg'`.
- `compress` (number) -- Compression level of the result image, from 0 to 1. Defaults to `1.0`.

Returns `{ uri, localUri, width, height }`, where `uri` is a URI to the snapshot, `localUri` is a synonym for `uri` that makes this object compatible with `texImage2D`, and `width` and `height` specify the dimensions of the snapshot.
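For instance (building on the headless sketch above, with illustrative option values), the returned object can be passed straight back to `texImage2D` thanks to its `localUri` field:

```js
const snapshot = await GLView.takeSnapshotAsync(gl, {
  rect: { x: 0, y: 0, width: 512, height: 512 },
  format: 'jpeg',
  compress: 0.8,
});

// `snapshot` is { uri, localUri, width, height }; because it has `localUri`,
// it can be used directly as the pixel source of a texture upload
gl.bindTexture(gl.TEXTURE_2D, gl.createTexture());
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, snapshot);
```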
Since the WebGL API is quite low-level, it can be helpful to use a higher-level graphics API that renders through a `GLView` underneath; several Expo-specific libraries integrate popular graphics APIs this way.

Any WebGL-supporting library that expects a WebGLRenderingContext can be used. Sometimes such libraries assume a web JavaScript context (for example, that `document` exists). Usually this is limited to resource loading or event handling, with the main rendering logic still using pure WebGL, so these libraries can generally still be used with a couple of workarounds. The Expo-specific integrations include workarounds for some popular libraries.
Once the component is mounted and the OpenGL ES context has been created, the `gl` object received through the `onContextCreate` prop becomes the interface to the OpenGL ES context, providing a WebGL API. It resembles a WebGL2RenderingContext in the WebGL 2 spec. However, some older Android devices may not support WebGL 2 features. To check whether the device supports WebGL 2, it's recommended to use `gl instanceof WebGL2RenderingContext`.
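For example, a feature-detection sketch inside `onContextCreate`; the WebGL 2 calls used here are just an illustration:

```js
onContextCreate = gl => {
  // Detect whether this device exposes the WebGL 2 API
  const isWebGL2 = gl instanceof WebGL2RenderingContext;

  if (isWebGL2) {
    // Safe to use WebGL 2 entry points, e.g. vertex array objects
    const vao = gl.createVertexArray();
    gl.bindVertexArray(vao);
  } else {
    // Stick to WebGL 1 level features
    console.log('WebGL 2 is not available on this device');
  }

  gl.endFrameEXP();
};
```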
An additional method, `gl.endFrameEXP()`, is present; it notifies the context that the current frame is ready to be presented. This is similar to a 'swap buffers' API call on other OpenGL platforms.
The following WebGL2RenderingContext methods are currently unimplemented:
getFramebufferAttachmentParameter()
getRenderbufferParameter()
compressedTexImage2D()
compressedTexSubImage2D()
getTexParameter()
getUniform()
getVertexAttrib()
getVertexAttribOffset()
getBufferSubData()
getInternalformatParameter()
renderbufferStorageMultisample()
compressedTexImage3D()
compressedTexSubImage3D()
fenceSync()
isSync()
deleteSync()
clientWaitSync()
waitSync()
getSyncParameter()
getActiveUniformBlockParameter()
The `pixels` argument of `texImage2D()` must be `null`, an `ArrayBuffer` with pixel data, or an object of the form `{ localUri }` where `localUri` is the `file://` URI of an image in the device's file system. Thus an `Expo.Asset` object can be used once `.downloadAsync()` has been called on it (and has completed) to fetch the resource.
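A sketch of loading a bundled image into a texture this way. The asset path is hypothetical, and the `Asset` import shown is for older SDKs (newer SDKs expose it from the `expo-asset` package):

```js
import { Asset } from 'expo'; // newer SDKs: import { Asset } from 'expo-asset';

async function loadTexture(gl) {
  // Resolve and download a bundled image so it gets a file:// localUri
  const asset = Asset.fromModule(require('./assets/texture.png')); // hypothetical path
  await asset.downloadAsync();

  // The asset object works as the pixels argument because it has `localUri`
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, asset);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);

  return texture;
}
```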
For efficiency reasons the current implementations of the methods don't perform type or bounds checking on their arguments. So, passing invalid arguments could cause a native crash. We plan to update the API to perform argument checking in upcoming SDK versions. Currently the priority for error checking is low since engines generally don't rely on the OpenGL API to perform argument checking and, even otherwise, checks performed by the underlying OpenGL ES implementation are often sufficient.