Frequently Asked Questions
Welcome to the Normcore FAQ. If your question is not answered below or in the documentation, please join us on our Discord server to ask us there.
Q: Where do I get Normcore? A: Right here!
Q: How do I get an API / app key? A: After signing up for a Normcore account, visit the Normcore Dashboard and select “Create a new application”. Enter a name for your app, click ‘Save’, and your app key will be generated and displayed below the name. We’ll also send you an email with your new key.
Q: I’m unable to add a new application on the Normcore dashboard, and I see an empty black page when attempting to add a new payment method. A: If you were a user of Normal Chat, this is likely the cause. Try creating a new account on the Normcore dashboard and then adding the payment method again. If you still run into errors, please reach out to email@example.com.
Q: I’m using a plugin like VRIK for IK movement on avatars. When I teleport or move my camera rig, the camera rig moves, but my avatar stays in the same place.
A: If your avatar setup uses a camera rig, you’ll need to wire it up to RealtimeAvatarManager’s Root, Head, and Hand transform inputs. This makes sure your RealtimeAvatar instance matches the transforms of your camera rig instead of using the default Unity XR APIs.
Q: When I try to sync a specific object to the Datastore, I get an error that says it cannot be serialized. A: Only primitives can be serialized (int, float, Vector3, string, etc.), so if you need to sync a specific kind of object, like a TextMesh Pro object, you’d want to create a field for each property on the object that you’d like to sync.
Q: I’m using a RealtimeArray to store a list. I’m able to add to the list, but I don’t see a remove function?
A: This is intentional. An alternative would be to use a RealtimeSet or RealtimeDictionary, which support removing entries.
Before diving into anything ownership related, we highly recommend reading our guide on how ownership works with models.
Q: How can I check who is holding an object?
A: We recommend using the owner of the RealtimeTransform component to do this. When you grab an object, you’ll need to call RequestOwnership() on the RealtimeTransform in order for others to see your updates to it. Other clients can then use the ownerID property on RealtimeTransform to see who is holding it.
One exception is if you’re using a rigidbody with RealtimeTransform. When you throw an object, you’ll want to remain the owner so you can simulate it flying through the air; RealtimeTransform will clear the owner when the object comes to rest. In this case, we recommend checking whether the rigidbody is kinematic (being held) and then who the owner of the RealtimeTransform component is.
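For example, here is a minimal sketch of both checks. The class and method names (Grabbable, OnGrab, IsHeld, HolderClientID) are hypothetical; RequestOwnership() and ownerID are the RealtimeTransform members described above, and the sketch assumes ownerID is -1 when the transform is unowned.

```csharp
using UnityEngine;
using Normal.Realtime;

public class Grabbable : MonoBehaviour {
    private RealtimeTransform _realtimeTransform;
    private Rigidbody         _rigidbody;

    private void Awake() {
        _realtimeTransform = GetComponent<RealtimeTransform>();
        _rigidbody         = GetComponent<Rigidbody>();
    }

    // Call this from your grab interaction so others see your updates.
    public void OnGrab() {
        _realtimeTransform.RequestOwnership();
    }

    // True while a client is holding the object (kinematic + owned).
    public bool IsHeld() {
        return _rigidbody.isKinematic && _realtimeTransform.ownerID != -1;
    }

    // The clientID of the holder (-1 when unowned).
    public int HolderClientID() {
        return _realtimeTransform.ownerID;
    }
}
```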
Q: I have a RealtimeView in my scene, and the first client to connect can make changes to RealtimeComponents that are attached to it, but other clients cannot.
A: Uncheck “Owned by Creating Client” under Advanced Settings and re-export your build.
Q: I’m walking through the Color Sync tutorial, and when I try to change the cube color, nothing happens.
A: Uncheck “Owned by Creating Client” under Advanced Settings and re-export your build. Your standalone build is becoming the owner of the ColorSync RealtimeComponent/model when it’s launched. Then, when you hit Play in the editor, the editor client is unable to make changes because the RealtimeView is owned by the standalone build. Unchecking “Owned by Creating Client” ensures the RealtimeView will have no owner and will be editable by everyone in the room.
Q: I’m using a Custom Realtime Component and I want to limit manipulation of it to only the user who is currently interacting with it.
A: An easy way to limit who can manipulate a RealtimeComponent is to take over ownership of the RealtimeView it’s attached to. If a RealtimeView (or any parent view) is owned by a client, only that client will be able to make changes to it or any of its components.
However, if you need RealtimeComponents on a single game object to be locked to different clients, you can enforce ownership by adding a meta-model to your RealtimeComponent’s model. To do this, in your custom model, change the model attribute to [RealtimeModel(true)] and regenerate. This adds methods that let you request ownership of the model, which prevents other clients from modifying it.
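As a sketch, a custom model with the meta-model enabled might look like this. ScoreModel and its _score property are hypothetical examples; the [RealtimeModel(true)] attribute is the one described above.

```csharp
using Normal.Realtime;
using Normal.Realtime.Serialization;

// Passing true to RealtimeModel adds a meta-model that tracks ownership.
[RealtimeModel(true)]
public partial class ScoreModel {
    // (propertyID, reliable)
    [RealtimeProperty(1, true)]
    private int _score;
}
```

After regenerating the model code, ownership methods become available on the model, so the interacting client can request ownership before making changes.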
Q: I’m unable to change the values on a model on someone else’s avatar.
A: RealtimeAvatarManager ensures the root RealtimeView of your avatar is owned by the creating client. This prevents any client from modifying RealtimeComponents or models on avatars besides their own.
Physics and collisions
Q: I have an object with a rigidbody and a RealtimeTransform attached. When I start my application, I expect the object to fall based on physics simulation, but it stays in the air.
A: This is because the object does not have an owner set (or the owner is not present in the room), so it will not simulate until an owner is set. Currently, you’ll need to manually call RequestOwnership() on the RealtimeTransform in order to begin simulating the object. In an upcoming update, RealtimeTransform will also pay attention to the view’s ownership, at which point you’ll be able to have the scene view owned by the first client to connect.
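Until that update ships, a sketch of the current workaround might look like this. The script name is hypothetical, and depending on your setup you may want only one client (e.g. the first to connect) to run it.

```csharp
using UnityEngine;
using Normal.Realtime;

public class SimulateOnStart : MonoBehaviour {
    private void Start() {
        // One client must own the RealtimeTransform for the
        // rigidbody to begin simulating.
        GetComponent<RealtimeTransform>().RequestOwnership();
    }
}
```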
Q: What’s the proper way to create a room/get player count manually?
A: We recommend using the RealtimeAvatarManager’s avatars list to get a count of how many other players are in the room. We’re currently working on more detailed APIs to query the number of connected clients for each room.
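For example, a sketch of reading the count. PlayerCounter is a hypothetical name; avatars is the collection RealtimeAvatarManager maintains, keyed by clientID.

```csharp
using UnityEngine;
using Normal.Realtime;

public class PlayerCounter : MonoBehaviour {
    [SerializeField] private RealtimeAvatarManager _avatarManager;

    // avatars maps clientID -> RealtimeAvatar for each avatar in the room.
    public int PlayerCount() {
        return _avatarManager.avatars.Count;
    }
}
```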
Q: I’d like to get the string of the room name set in the inspector at runtime. Is there a way to do this?
A: Realtime._roomToJoinOnStart is private, so you should call the Connect() method yourself and keep track of the room name you pass to it. Then you’ll be able to retrieve the current room name at runtime.
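A sketch of that approach (RoomConnector and its field names are hypothetical; Connect() is the Realtime method mentioned above):

```csharp
using UnityEngine;
using Normal.Realtime;

public class RoomConnector : MonoBehaviour {
    [SerializeField] private Realtime _realtime;
    [SerializeField] private string   _roomName = "MyRoom";

    public string CurrentRoomName { get; private set; }

    private void Start() {
        // Leave "Join Room On Start" unchecked and connect manually,
        // remembering the room name we passed in.
        _realtime.Connect(_roomName);
        CurrentRoomName = _roomName;
    }
}
```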
Q: Is there a user account / management system implemented? A: Not yet, but this is something we are looking into adding in a future Normcore update.
Q: How can I retrieve a list of all connected users / current rooms? A: This is coming in a future update of Normcore.
Q: How can I reset all the data in a room? A: This is coming in a future update to our Dashboard.
Q: Is there a callback for when all models are deserialized and prefabs are instantiated for a room?
A: The didConnectToRoom event will fire once this is done.
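For example (RoomReady is a hypothetical name; didConnectToRoom is the Realtime event mentioned above):

```csharp
using UnityEngine;
using Normal.Realtime;

public class RoomReady : MonoBehaviour {
    [SerializeField] private Realtime _realtime;

    private void Awake() {
        _realtime.didConnectToRoom += DidConnectToRoom;
    }

    private void DidConnectToRoom(Realtime realtime) {
        // At this point all models are deserialized
        // and prefabs are instantiated.
        Debug.Log("Connected to room.");
    }
}
```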
Q: How can I handle teleportation and locomotion? A: We have a guide coming for this, it will be added to the Normcore documentation soon.
Q: What’s a good best-practices workflow for developing VR multiplayer in terms of testing multiple clients on the same machine? A: We encourage you to test as often as possible with other users and networked machines/devices, but you can test on a single machine by creating a standalone build.
Run the standalone build(s) and run an instance of the application in Unity’s play mode. You should see the clients connect to the room and data syncing as expected.
Note: You’ll need to disable VR support on the standalone build in order for the Unity editor to enter play mode while the standalone build is running. SteamVR will only allow one VR app to run at any given time (i.e., play mode in Unity will kill any VR standalone builds).
Networking and connections
Q: I have animations or other logic that I need to reliably keep in sync across clients. What’s the best approach?
A: Realtime.room.time is a double containing a Unix timestamp that is synchronized between all clients. We usually expect it to be accurate within +/- 5 ms. Once in sync, the clock will not jump, so animations should remain smooth.
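For example, a sketch that drives an animation from the room clock so every client renders the same state. SyncedSpinner and the rotation math are hypothetical; room.time is the synchronized clock described above.

```csharp
using UnityEngine;
using Normal.Realtime;

public class SyncedSpinner : MonoBehaviour {
    [SerializeField] private Realtime _realtime;

    private void Update() {
        if (!_realtime.connected) return;
        // room.time is the same on every client, so the rotation matches
        // across the room without sending any extra state.
        double t = _realtime.room.time;
        transform.rotation = Quaternion.Euler(0f, (float)(t * 90.0 % 360.0), 0f);
    }
}
```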
Q: Does Normcore send a keep-alive packet? A: Normcore doesn’t use a dedicated keep-alive packet. Every network frame, the server and clients send out a packet whose header includes information about the connection. If there is no data to send, just the header is sent, which usually equates to about 4 bytes for an empty packet.
Q: I’m using hotel / conference center / enterprise WiFi and unable to connect to Realtime. A: Normcore uses UDP traffic for Realtime communication. Many enterprises, airports, hotels, and conference centers block UDP traffic as it can be used maliciously. If possible, move to a network or hotspot that doesn’t block UDP traffic, or speak with your network admin about getting UDP traffic unblocked on your network.
Q: Sending my app into the background causes the persistent content to disappear when I return/re-open the app on Android. A: Android pauses apps entirely when backgrounding them. You may be able to change your app manifest to allow network connectivity in the background. By default, Normcore is set up to keep the server connection alive even when an app is paused. However, if Android is killing your connection based on OS limitations, you will be disconnected.
Q: I’m getting a “Log: [Error]Reliable channel send Queue is full” message. What could cause this?
A: You likely have a model with a reliable property that’s changing too often. Because reliable updates can’t be thrown out, they’ll stack up until the queue is full. Check your reliability settings and see if you can tweak anything.
Q: Does Normcore work with the WebGL platform? A: No, unfortunately. WebSockets are TCP-only, which doesn’t support unreliable messages, and Normcore relies on UDP, which WebGL doesn’t currently support.
Voice chat and audio
Q: What codec does Normcore use to encode VOIP audio? A: Normcore uses Opus 1.3 for encoding/decoding VOIP audio.
Q: Is there a method to change the microphone Normcore uses?
A: In RealtimeAvatarVoice.cs, we call RealtimeAvatarVoice (line 123) with an empty string, which passes the system default audio input device to Unity as the microphone. You can change this empty string to your desired device’s identifier to pass whatever source you’d like to Unity.
Q: Can I change the volume of a user’s voice? A: We recommend setting the volume of the audio source component on each avatar, or if you’re doing more complex attenuation, create an audio effect that operates on the audio samples directly.
Q: Can I disable voice chat?
A: To completely remove voice chat, remove the RealtimeAvatarVoice component from your player prefab. For a temporary disable, there is a mute property on RealtimeAvatarVoice (on the head component). Note: You can also mute RealtimeAvatarVoice on remote players’ prefabs, in which case the local player will not hear the muted player. Additionally, you can mute the local player’s RealtimeAvatarVoice component to stop sending their voice to all clients.
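For example, a sketch of toggling the mute property (VoiceMuter is a hypothetical name):

```csharp
using UnityEngine;
using Normal.Realtime;

public class VoiceMuter : MonoBehaviour {
    // RealtimeAvatarVoice lives on the avatar's head.
    [SerializeField] private RealtimeAvatarVoice _voice;

    // Mute a remote player's voice locally, or mute the local
    // player's component to stop sending their voice to everyone.
    public void SetMuted(bool muted) {
        _voice.mute = muted;
    }
}
```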
Cross platform development
Q: I have conflicting project settings for different platforms (e.g. AR vs VR). How do you suggest abstracting source files between project files to keep shared components in sync and project settings separated as needed? A: This is the approach we use internally: create different Unity projects for each of the conflicting platforms. Name these “YourProjectName-Platform” (e.g. NormalChat-AR, NormalChat-VR). Each one should be empty except for ProjectSettings and any assets / third-party packages that are unique to that platform’s project (e.g. SteamVR, Oculus, or Magic Leap plugins).
Then we create an empty git repo called YourProjectName-Common. We use git to add this repo as a submodule in the Assets folder of YourProjectName-VR / YourProjectName-AR projects. This way, all projects share the same common resources, but can keep settings and platform-specific resources isolated.
You’ll need to remember to make commits to the YourProjectName-Common repo and then YourProjectName-VR/YourProjectName-AR repos separately.
Q: I’m building for iOS / ARKit, and the Xcode project always has errors. Without Normal, it compiles fine. A: It’s likely you are missing a linked library. Check out the FAQ at the bottom of our AR Spectator Guide in the Normcore documentation for more details.
Magic Leap development
Q: I’m working on a Magic Leap application. Are there any specific steps I need to take to get Normcore functioning on device?
A: As of writing, there are some differences in how the Magic Leap SDK works that require the following steps to be done in order for Normcore to properly connect on device deployments. Big thanks to Discord user
Trevor#4846 for helping uncover these steps:
1) Drag the Realtime + VR Player prefab into the scene and enter your app key. Uncheck “Join Room On Start” in the Realtime + VR Player details pane.
2) Add the Privilege Requester script to an object that will request permissions from the user. Change the Privileges size to 1 and set it to Audio Capture Mic. Add the functions from
3) Ensure that you have the Plugins/Lumin/manifest.xml file in your project, and make sure to add the privileges for Internet and Audio Capture Mic. In your Unity project, go to Assets/Normal/Realtime/Native/Plugins/Android/arm64 and select
4) In the inspector, select “Lumin” as an included platform.
Note: The default VR Player character prefab is completely black. Consider changing it to make it more visible on device.
Q: I’m working with a Magic Leap and avatars aren’t moving around when the device moves.
A: Magic Leap uses “Main Camera” for the HMD’s transform information. Drag the “Main Camera” from your scene into the “Head” slot under RealtimeAvatarManager’s head/hand positions.
Synchronize multiple devices in the same space
Q: Does Normcore include any features natively for syncing real-world space between users in the same physical location, i.e., syncing the location of digital objects to appear in the same relative location in the real world? A: This is something that some platforms have added support for (e.g. iOS ARKit has anchors, ARCore has pins, Magic Leap uses PCFs). Normcore doesn’t handle any of this syncing automatically, but you can serialize the needed data and use your platform’s native shared-location methods to synchronize virtual spaces to real-world spaces.