Migrating to Google Shared/Team Drive

Filed under Uncategorized.

We recently needed to move about 500GB of data, accumulated over 5 years and owned by a long list of people we no longer work with, into a Shared Drive as part of our G Suite. This move is intended to give us full control over ownership and management of the files.

Unfortunately Google provides no way to migrate files and folders that aren't owned by you or anyone in your org (domain). I.e. if a file is owned by a *@gmail.com address (even if it's mine!), I cannot move it to our Shared Drive under the *@Brightrockgames.com domain.

The only way to achieve this migration is to copy every file and folder. But again, Google provides no way to do this in bulk: via the web interface you'd have to do it manually, file by file, and Google File Stream doesn't support creating or copying gdoc files and folders either.

So what now? We turn to code! The Google Script below copies all files and folders into a target folder, skipping any file that already exists there.
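A minimal sketch of the approach (the folder IDs, function names and the name-based duplicate check are illustrative assumptions rather than the exact script we ran, so test it on a small folder first):

    // Recursively copy everything from one Drive folder into another,
    // skipping files that already exist in the target (matched by name).
    var SOURCE_FOLDER_ID = 'your-source-folder-id';
    var TARGET_FOLDER_ID = 'your-shared-drive-folder-id';

    function copyAll() {
      var source = DriveApp.getFolderById(SOURCE_FOLDER_ID);
      var target = DriveApp.getFolderById(TARGET_FOLDER_ID);
      copyFolder(source, target);
    }

    function copyFolder(source, target) {
      // Copy files that aren't already present in the target folder.
      var files = source.getFiles();
      while (files.hasNext()) {
        var file = files.next();
        if (!target.getFilesByName(file.getName()).hasNext()) {
          file.makeCopy(file.getName(), target);
        }
      }
      // Recurse into sub-folders, creating them in the target if needed.
      var folders = source.getFolders();
      while (folders.hasNext()) {
        var folder = folders.next();
        var existing = target.getFoldersByName(folder.getName());
        var newTarget = existing.hasNext() ? existing.next() : target.createFolder(folder.getName());
        copyFolder(folder, newTarget);
      }
    }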

Unfortunately it is still not perfect – Google Scripts are bound by a bunch of quotas, such as daily execution time and the total number of Drive operations allowed. So this script will have to be run repeatedly, possibly over a week. But at least it should do the job perfectly, eventually. :/

Using ‘git sync’ to automate common Git commands

Filed under Uncategorized.

At Brightrock Games about five of us actively use Git all day, committing to and pulling from the same branch. This means that every time you want to pull or push, there are several manual steps: stash any outstanding changes you haven't got around to committing yet (or any unnecessary junk Unity has generated), pull with rebase if there is something upstream, then finally push. It's an annoyance that wastes a few minutes every few hours.

Git best practice suggests that each developer works in their own branch until they're ready to merge, but in practice, when the whole team is rapidly pumping out bug fixes and improvements, it's just easier to work together on one branch.

To improve our workflow I have created a git alias called git sync that performs git stash -> git pull --rebase -> git push -> git stash pop.

On the command line, paste in the following to create a global git alias:
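(The alias below is a reconstruction of the idea rather than our exact minified version – the longer script further down handles the edge cases this one-liner glosses over, such as having nothing to stash.)

    git config --global alias.sync '!git stash && git pull --rebase && git push && git stash pop'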

That’s it.
For reference here is the non-minified script code:
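(As above, this is an illustrative reconstruction rather than a verbatim copy; it only pops the stash if it actually created one.)

    #!/bin/sh
    # git sync: stash local changes, rebase onto upstream, push, then restore the stash.
    set -e

    # Refresh stat info, then only stash (and later pop) if there is something to stash.
    git update-index -q --refresh
    stashed=0
    if ! git diff-index --quiet HEAD -- || [ -n "$(git ls-files --others --exclude-standard)" ]; then
        git stash push --include-untracked -m "git-sync"
        stashed=1
    fi

    git pull --rebase
    git push

    if [ "$stashed" -eq 1 ]; then
        git stash pop
    fi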

 

Migrating your project to Git LFS

Filed under Game Development, Unity3D.

For the past 4 years we have been using Git as the repository for our game War for the Overworld. That's 18,000+ commits totalling a massive 35GB repo. It's huge. So huge that our host, BitBucket, falls apart if one tries to clone the repo anew. I don't blame BitBucket for this – at some point after we joined them they added a 1GB repo size limit and have been gracious enough not to apply that restriction to us. Ultimately the problem lies with Git itself – it wasn't built to handle all the binary data we have thrown into it.

For the past few months I've been looking into migrating our giant repo to Git LFS and then pushing it up to GitHub, our new home. This mission has been a lot more painful than I had imagined, so I am recording the steps I took to make it happen.

Mirroring the repository

The first thing we need to do is mirror the remote repository so we can perform the LFS migration before pushing back up to our new remote (GitHub). Doing this is quite simple:
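(The Bitbucket URL below is a placeholder – substitute your own remote.)

    git clone --mirror git@bitbucket.org:yourteam/war-for-the-overworld.git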

4 hours later…we move on.

Migrating to LFS

Okay. Now for the serious business, and to explain a few things. What I'm doing here is a deep migration to LFS: I am walking through every single commit in the entire Git history, replacing every binary file I've identified with its LFS representation, and uploading that file to our remote host. This kind of solution works best if you're also going to push the resulting repo to a new location (even on the same host). If you simply want to rewrite a few files into LFS in a pre-existing hosted repo then you may be better off using the official Git LFS Migrate tool and tutorial.

In this case we need to make use of some fairly dark magic, made easy thanks to this tool. Once downloaded, it is run like so:
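A rough example of the invocation – the jar name, remote URL and file patterns here are placeholders to adapt to your own repo:

    java -jar git-lfs-migrate.jar \
      -s war-for-the-overworld.git \
      -d converted/war-for-the-overworld.git \
      -g git@github.com:yourorg/war-for-the-overworld.git \
      "*.png" "*.psd" "*.fbx" "*.wav" "*.dll"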

-s is the folder of the mirror we downloaded in the previous step.

-d is a new folder where the new migrated mirror will go.

-g is the new remote host location. This is necessary because this tool begins uploading the new LFS files immediately during this process.

Couple of things to note:

  • To Unity devs – I left out the *.unity file type, as we change scenes very often and every change creates a new binary file, which made for an overly large LFS store. Unity scene files are also diffable by Git as text, so despite their large file size I've found they are fine left in plain Git.
  • This tool isn't perfect – if your repo is big enough and the process long enough, the remote host (GitHub) will eventually kick you off your session and the tool will crash. In this case you need to delete the converted/war-for-the-overworld.git/objects/ folder and restart the migration process. Luckily the LFS files you've already uploaded will be skipped, so eventually it will complete! It took me days to do our repo.

This will kick-off the very long process of walking Git history, rewriting files and commits, and uploading the new LFS files to the remote LFS store. This does NOT migrate the repo itself to the new remote location though…

Uploading to the remote host

The final part – Pushing to the remote host! This should be the easiest part, but it wasn’t for us…

First things first, let's clean up the mirror before we push it:
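For example, from inside the converted mirror (--aggressive is optional but squeezes the packs down further):

    cd converted/war-for-the-overworld.git
    git gc --aggressive --prune=now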

Running a git gc will cause Git to remove any loose files and garbage, as well as compress all the file blobs. This might take a while…

Once done, you simply run the below command to upload it to the remote host:
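Still inside the converted mirror – the GitHub URL is again a placeholder for your own new remote:

    git push --mirror git@github.com:yourorg/war-for-the-overworld.git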

If your repo is <1GB in size, this should complete without much of an issue and you can walk away from this article.

If you're in my position, where the resulting repo is still fairly large (ours is 2.5GB), then you're going to need to enlist the help of GitHub support to temporarily relax two key limits:

  • GitHub has a 100MB limit on individual files – we hit that with some of our old Unity scene files, and I didn't want to put them into LFS as they're diffable plain-text files, and doing so blew up the LFS store by a factor of 5.
  • GitHub has a 1GB single-push limit – since our repo is well over 1GB and Git does not provide a way to push a mirror in multiple parts, we could not push our repo without being kicked off by GitHub once we hit 1GB+ uploaded.

Thankfully GitHub support have been accommodating to us and have relaxed these limits for the purposes of getting the repo up there.

Extras

  • You can update an existing local mirror with git remote update.

Creating a cutout shader for doors and windows

Filed under Programming, Unity3D.

The game I'm currently working on is a 3D isometric management game where you build rooms on a space station. The rooms and the station are made up of a series of tilesets, so the player can effectively paint out rooms of any size and shape they wish, and the wall tiles are snapped together and appropriately themed based on the room type. All relatively easy stuff at this point. However, we also allow the player to place doors and windows, and these objects may be multiple tiles wide.

The first solution that came to us was to see whether we could complicate the tileset system a little to support sub-tilesets made up of chunked-up doors and windows. After some quick maths this didn't look good – the tileset requirements and combinations were quickly exploding into really big numbers. This could possibly have been mitigated by cutting up the tiles even further and adding an automated tileset-builder step that constructs the combinations required. But that sounded no good. There must be another way!


Read more »

Improving the FBX workflow between 3ds Max and Unity3D

Filed under Uncategorized.

As some of our artists swear by 3ds Max, we must support it and ensure the workflow between it and Unity3D is as seamless as possible. Unfortunately, even though 3ds Max is among the top few 3D modelling applications, there are still some serious workflow issues with it. One of the common tasks one performs – exporting 3D models to the FBX file format – is fraught with problems:

  • 3ds Max uses a Z-up axis whilst Unity3D is Y-up. The FBX Exporter doesn't convert this properly, so Unity3D detects it and adds a -90 degree rotation to the imported object's transform.
  • Before exporting each object, the artist must zero out its position so it doesn't import offset from [0,0,0].

Having to do this for many objects is extremely time consuming for the artist. We need something that can automate this! And batch it!

Thanks to a lot of work by Jos Balcaen I have been able to modify and improve his batch exporter to support a few Unity oddities.

Download the MaxScript here.


Why is Physics.UpdateBodies using up so much time? 20ms to 2ms

Filed under Game Development, Programming, Unity3D.

As part of some initial prototyping and load testing for our next game, we had to determine how to implement the physics system. The game requires a dynamic world and the ability to simulate 100-500 agents. That kind of target is right on the edge of needing a bespoke system, so we want to take every advantage we can to meet a 60 FPS minimum. We're long-time Unity3D devs and were quite wary of its implementation of PhysX, but we had to try it anyway – the tooling and maintenance benefits of using it would be of great value, let alone not having to write our own implementation.

Our test consisted of a single large floor collider and the “Unity Guy – Ethan” rigged model and animation assets (Rigidbody + Capsule Collider components), with a simple script that has him run around a couple of rally points. No pathfinding. We would spawn 500 of these agents and see what the frame times look like with them all within the camera frustum and, more importantly here, when they’re outside of the camera frustum.

There were a whole bunch of small optimizations we tweaked and played with, however there was one striking issue that really stumped us:

[Profiler screenshot: Physics.UpdateBodies taking ~20ms per frame]

What the fuck? 20ms?! Wow Unity’s physics system is SO. SHIT!

Actually. It’s not. It took the better part of a day and a Unity Dev to explain in way too few words what this magical Physics.UpdateBodies method was:

“it’s sending all the transform update messages.”

Huh. Okay. So after digging around for a bit it turns out the problem is this: for whatever reason, this method is responsible for sending back the transform updates of every transform under a Rigidbody. The model we were testing with has, as with all imported rigged models, its default skeleton structure laid out in 20-30 empty GameObjects. It's a well-known optimization to trim these as much as possible, and it's something we would do when we got around to really implementing animated characters. However, it seems quite insane for it to impact physics.

In any case the solution is simple – remove the dead transforms (you can read about optimizing rigged models here) and…

[Profiler screenshot: Physics.UpdateBodies down to ~2ms per frame]

20ms to 2ms. Crazy. Simulating 500 or even 1000 active agents is actually possible right now. I wonder how many Unity developers dump the physics system without realizing this optimization.

Important Tips on Hiring a Video Games Writer

Filed under Game Development.

I’m Scott Richmond, the Producer and a programmer at Brightrock Games. I am lucky enough to have been able to go through the experience of Kickstarting and, as of April 2015, successfully releasing our game War for the Overworld. WFTO had a considerable amount of writing in it as well as voice acting, and in this article I will be taking this experience and discussing how we aim to tackle the writing department with our new game title. Just before I start, I’d like to thank Chris Avellone for taking the time to discuss with me the inner world of video game writers.

Read more »

How to Improve the Performance of Unity3D Animations

Filed under Uncategorized.

In our game War for the Overworld we make pretty heavy use of animations, with players potentially seeing 25-100 units on screen at once. This means that animation rendering takes up a considerable part of our frame time, and any tuning there can create a noticeable performance improvement. Recently I decided to take another look and found what I think is a lesser-known Unity feature – Optimize Transform Hierarchy. Using this feature reduced our animation rendering overhead by 50%!

[Screenshot: the Optimize Transform Hierarchy option]

Executing this feature on an animated prefab will cause it to remove all those empty bone Game Objects.

Why might you want to do this? Because it costs precious time for Unity to move those Game Objects along with the actual animations; removing them does not affect the animations themselves. But do note that this means you cannot attach arbitrary child objects to your model – for example a weapon to its hand – and have them correctly follow the animation. Also be aware that it will delete any Game Objects you might have placed somewhere in the bone hierarchy. This can be solved with a bit of work, however.

Option 1 – Set everything up manually again

Often you'll want at least some of the bone Game Objects available for use, such as the hands and head. You can manually dictate which bones Unity will create a Game Object for in the Rig tab of the model importer.

[Screenshot: the model importer Rig tab]

Note that this won't automatically update the model prefab if you've already made one, so you'll need to recreate the prefab with the new mesh.

Option 2 – Use a script to automate it!

We have some 70 units in our game, so manually updating all the prefabs was going to be a bad idea. I created a Unity Editor script that takes a set of prefabs to be processed. It searches each prefab for a Skinned Mesh, flags any bones that contain a custom Game Object, adds them as bones that need to stay, and finally optimizes the mesh. It worked on our messy unit prefabs so hopefully it'll be fine for you too; however, it might not cover all edge cases, so make sure you double check everything is good afterwards.
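A minimal sketch of that approach is below, assuming the prefabs are selected in the Project window. The menu entry, the bone-detection heuristic and the prefab handling are simplified assumptions rather than the exact script we use, so treat it as a starting point:

    using System.Collections.Generic;
    using UnityEditor;
    using UnityEngine;

    public static class RigOptimizer
    {
        // Illustrative menu entry - run with the unit prefabs selected in the Project window.
        [MenuItem("Tools/Optimize Selected Rig Prefabs")]
        private static void OptimizeSelected()
        {
            foreach (GameObject prefab in Selection.gameObjects)
            {
                SkinnedMeshRenderer skinned = prefab.GetComponentInChildren<SkinnedMeshRenderer>();
                if (skinned == null)
                    continue;

                // Flag any bone that has a non-bone child attached (e.g. a weapon or
                // effect object) so its GameObject is kept after optimization.
                var bonesToKeep = new List<string>();
                foreach (Transform bone in skinned.bones)
                {
                    if (bone == null)
                        continue;
                    foreach (Transform child in bone)
                    {
                        bool childIsBone = System.Array.IndexOf(skinned.bones, child) >= 0;
                        if (!childIsBone && !bonesToKeep.Contains(bone.name))
                            bonesToKeep.Add(bone.name);
                    }
                }

                // Strip the remaining empty bone GameObjects, exposing only the flagged ones.
                AnimatorUtility.OptimizeTransformHierarchy(prefab, bonesToKeep.ToArray());
                EditorUtility.SetDirty(prefab);
            }

            AssetDatabase.SaveAssets();
        }
    }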

Tuning Git for large binary repositories

Filed under Uncategorized.

Git isn't particularly well suited to video game repositories where you want to version binary files such as art assets, and at some point it will start to crack at the seams. Currently there is no 64-bit Git build for Windows, and at some point heavy tasks such as a git gc will begin to crash with Out of Memory exceptions.

Until a 64-bit build is released for Windows, one can make the following changes to ensure Git doesn't hit the 32-bit memory limit.

In .git/config add the following:
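Something along these lines keeps Git's memory use under the 32-bit ceiling – the exact values below are examples you may want to tune for your machine:

    [core]
        packedGitLimit = 128m
        packedGitWindowSize = 128m
    [pack]
        deltaCacheSize = 128m
        packSizeLimit = 128m
        windowMemory = 128m
        threads = 1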

In .git/info/gitattributes we will set a number of file extensions to binary and remove the delta diff functionality from them:
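For example – extend the list with whatever asset types your project versions; binary marks the files as non-text and -delta stops Git attempting delta compression on them:

    *.png binary -delta
    *.tga binary -delta
    *.psd binary -delta
    *.fbx binary -delta
    *.wav binary -delta
    *.ogg binary -delta
    *.dll binary -delta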

 

Using Meraki to deploy SMB networks

Filed under Networking.

I've recently become very infatuated with the Meraki brand of routing, switching and wireless products. Recently bought out by Cisco, Meraki products primarily rely on an extremely neat cloud-based software management service that comes with every Meraki device licence.

It's funny how I've seen my colleagues react to the way Meraki products are tightly coupled to their cloud service – some love the idea and some hate it. Personally, I see this kind of solution becoming the primary choice for small-medium businesses in the near future, with CLI-based network solutions – and the expensive engineers and management that come with them – dying out. Which is, I suppose, why Cisco bought Meraki for some $1.2 billion.

I'm not going to run a marketing speech for these guys, but my elevator pitch is this: Meraki products are in the same price range as Cisco products, however the time saved on initial deployment and future management, plus the additional features, makes them a much cheaper option.

The key points I typically harp on about are:

  • All Meraki products have one licence and all software features are unlocked.
  • All Meraki products come with software-based layer 7 firewalls, user traffic tracking and management, WAN optimisation, VPNs, and more – outside of the enterprise and the datacentre you don't need to buy any additional appliances. No need for Palo Altos or F5s.
  • The cloud management service is actually nice – Updates just happen. As the consultant you’re much more likely to be able to simply walk away after deployment.
  • Configuration of some of the more complex features (VPNs) is actually very simple – in my experience deployment time-frames are a fraction of the time usually spent on our favourite CLI-based products.

I have produced a solution template for SMBs that can help consultants quickly price up site solutions. You can download the PDF here or go to my GitHub page here for the Visio diagram.
