Half-Life 1 Custom Map – Chaocity3 by Sulsa


This map was an important one for me and my friends during our LAN days way back in the 2000s. It took me some time to track it down again, so I am mirroring it here.

Download it here

Note that this map was made for Half-Life 1 GoldSource (NOT the newer “Source” version).

Author’s Notes

---	
CHAOCITY 3.0 

by  S U L S A
---

Type: Halflife Deathmatch map
Players: 6-? players
Filename: Chaocity3.bsp

Email: [email protected]

=======================

THE STORY CONTINUED:  

Revenge of the damned...

	As the world begins to degenerate into a seething globe of distrust, war and famine, there are those that view it as an opportunity.  The partial success of the Chaocity experiment to develop a super-soldier for the United States military has led to more funding and broader experimentation.  Though hardly perfect, the project results are nearing the point of marginal usefulness.  The progress is being watched closely, for timely deployment on frontlines around the world is becoming a critical necessity.

It is at times like this, with pressure from the ones that provide the funding and incentive, but who have little concept of the dangers and hazards in this type of undertaking, that mistakes are made.  Such a mistake has happened in a new section of 'Chaocity' as the prototypes near production level effectiveness.

As the experimentation continues on the unwilling participants, the combat training begins to involve weapons capable of more massive destruction.  At the same time, Defense Department officials overseeing the project they are funding (and hoping to absolve themselves of the Black Mesa fiasco through this project's success) are putting increased pressure upon the scientists running 'Chaocity'.
As the scientists put in longer days, frustration and sleep deprivation begin to take their toll.  More and more, safety measures are disregarded in order to achieve quicker progress towards a 'working model'.  It is at the peak of one of these pushes towards a breakthrough when things take an unexpected and horrifying turn for the scientists and personnel of Chaocity.

An understaffed security/observation tower in quadrant 7c does not notice the test subjects gaining access to a large weapons and explosives depot.  The depot itself, located directly above a major access tunnel, is promptly destroyed.  The resulting explosion collapses a section of the tunnel and gives the battle-crazed soldiers above access to the massive infrastructure below. 

The soldiers, who know nothing but violence, death, fear and bloodlust, pour into the sub-level areas of the Chaocity experiment, hoping to find a way out as well as punish those who have condemned them to such a life.  What they find is a fantastic expanse of tunnels and laboratories that run underneath all of Chaocity.  Among other things, the soldiers find alien mag-lev technology on a huge underground transportation system and advanced teleport technology.  
They also find fierce resistance from the scientists who have found themselves a part of the experiment that they themselves had created.  Gun turrets and defenses are hastily erected while tunnels are blown to seal off the attackers, but ultimately the massive underground labyrinth becomes nothing more than a tomb.
 
 
=======================

THE MAP ITSELF:

	Chaocity 3 is the final map in the 'Chaocity' HLDM map series.  All of these maps were made to be very large, and this one is definitely the largest of the 3.  This map got its inspiration from the opening tram ride levels of the original HLSP game.  I was looking to recreate the massive architecture and breathtaking feeling of those introductory levels while providing an excellent place for fighting.

	This is the third map I have released to the public.  After successfully completing CC1 and CC2, I wanted to see how far beyond their size I could go.  It took over 8 months to finish and is being released nearly a year to the day after Chaocity 2.  With the knowledge gained from the first 2 Chaocities of how to optimize Zoner's compiling tools, I could have put even more into this map and made it larger.  Running out of patience and inspiration, I decided enough was enough. 

	All the weapons in the Halflife universe are here.  You will find 3 user controllable turrets, 4 teleporters and 2 trains that use mag-lev technology to get around.  I did not put any traps or special tricks in this map, so to those who may be disappointed:  sorry! 

As I said in CHAOCITY 2, and it bears repeating here:
There are very few BIG maps out there, and there are even fewer people who like the 'slower' type of game that goes along with the big size.  There is, however, a huge number of small and medium-sized maps of high quality, so if this is too big for you, you have thousands (literally) of others to choose from.
But if you are one of the few who love to play on the 'big chuggers', then I hope you enjoy Chaocity 3.

<:::WARNING:::>
This map in particular may not run AT ALL on some 'low end' computers because of the amount of texture memory required.  
I did not build this map for everyone.  
I did not build this map to score points on map review sites.
I built this map for the people who liked the first 2 maps in this series and who wanted to take a trip back to Chaocity for one last time.  
<:::WARNING:::>


=======================

THE CUSTOM SKY:

	You must put the 6 image files (6 TGAs) in your \\...Valve\gfx\env folder.  It is an improved version of the sky from Chaocity 2, so overwrite your original.


=======================

TECHNICAL STUFF:

Hardware:
	-AMD ATHLON 1.1 GHz processor with 512 MB RAM
	-GeForce 2 GTS 32 MB video card
Software:
	-Worldcraft 3.3
	-ZONER'S Tools
	-Wally
	-Spriteview
	-Pakscape
	-Adobe Photoshop 6
	-Adobe Illustrator 9.0

Compiling Time:
	-1h 31m 5s

Geek #s:
	-67471 base patches
	-visibility matrix   :  30.0 megs (because of '-sparse' switch on Zoner's Tools)
	-1295770 square feet [186591008.00 square inches]
	-3318 direct lights
	-12155 faces
=======================

THANK YOUs:

	-Dr. Dags, Huggie Bear and Flash.  PRE-beta testers and great friends for years.  (That's all four of us in the picture).
	-HL community members Primate (lead animator for Ronin Software), Davros (creator of 'Tardis' and 'Davroplex') and Mustang (owner/creator of Globalassault.com).
	-Clan [CMC] (Canned Meat Clan) for the actual Beta-testing and bug scrubbing; a kick-ass server [CMC] Blasterpiece Theater, great ideas and suggestions for improvement.  It is the greatest thing that we all survived the implosion of 'iCAST.com', and will be great friends for life.  You guys are THE BEST.
	-The web resources like GlobalAssault.com and the Halflife ERC.  
	-The guys who frequent the 'RUST gamedesign forums'.  If it wasn't for them and the many others like them out there, there wouldn't be ANY amateur game designers.  What I have learned from those guys over the past year and a half has given me the ability to make the maps I want to make.  Plus they are a great community and a hilarious bunch...
	-Most of all to the die hard Halflife fans who still keep this awesome game alive.
	
=======================

COPYRIGHT STUFF:

	This map is ALL MINE, except for two things:

	-HWARANGDO, the most comprehensive system of deadly fighting on the planet.  www.hwarangdo.com

	-GLOBALASSAULT, the best map review/depot site on the net.  3 years old and going strong.  www.globalassault.com

	I made this map from scratch.  It belongs to me.  Everything is original.  But you know what?  Use whatever you want out of it!  Turn it into whatever you want or take whatever you want for prefabs...ON ONE CONDITION:  Be cool, be honorable and GIVE CREDIT WHERE CREDIT IS DUE.

Once again, it has been a pleasure making this map for all the HL shooters out there, and an honor to count myself amongst the designers.


o--[-----

S u l s a

12-17-2001
	 


Migrating to Google Shared/Team Drive


We recently needed to move about 500GB of data, accumulated over 5 years by a ton of people we no longer work with, into a Shared Drive in our G Suite. This move is intended to give us full control over the ownership and management of those files.

Unfortunately Google provides no way to migrate files and folders that aren’t owned by you or anyone in your org (domain). I.e. if a file is owned by a *@gmail.com address (even if it’s mine!), I cannot move it to our Shared Drive under the *@Brightrockgames.com domain.

The only way to achieve this migration is by copying all files and folders. But again, Google provides no way to do this: via the web interface you’d have to copy file by file manually, and Google Drive File Stream doesn’t support creating or copying gdoc files and folders either.

So what now? We turn to code! The Google Apps Script below will copy all files and folders into a target folder. If a file already exists in the target, it’ll be skipped.

Unfortunately it is still not perfect – Google Apps Scripts are constrained by a bunch of quotas, such as total daily execution time and the number of Drive operations allowed. So this script may have to be re-run repeatedly, possibly over a week. But at least it should get the job done perfectly, eventually. :/

// Per-user cache, used to remember folders that have already been fully copied.
var _cache = CacheService.getUserCache();

function start() {
  // Drive folder IDs of the source and target, taken from their Drive URLs.
  var sourceFolder = "0B9icZteiWC_OXzhQbkgyNXVfaDQ";
  var targetFolder = "1DVsXFojkfURxj4yIR4VVpLQYP2b-T6xh";
  
  var source = DriveApp.getFolderById(sourceFolder);
  var target = DriveApp.getFolderById(targetFolder);
  
  copyFolder(source, target);  
}

function copyFolder(source, target) {
  // Skip folders that a previous run has already copied completely.
  var cachedFolder = _cache.get(source.getId());
  if(cachedFolder != null) return;
  
  var folders = source.getFolders();
  var files   = source.getFiles();
  
  while(files.hasNext()) {
    var file = files.next();
    var fileName = file.getName();
    
    // Only copy the file if it doesn't already exist in the target folder.
    var existingFile = getFileByNameIn(fileName, target);
    
    if(existingFile == null) {
      try { 
        file.makeCopy(fileName, target);
        console.log("Copied " + fileName + " file.");
      }
      catch(e) { console.log("FAILED to copy " + fileName + " file."); }      
    }
  }
  
  while(folders.hasNext()) {
    var subFolder = folders.next();
    var folderName = subFolder.getName();
    
    // Re-use the matching subfolder in the target if it exists, otherwise create it.
    var targetFolder = getFolderByNameIn(folderName, target);
    
    if(targetFolder == null) {
      targetFolder = target.createFolder(folderName);
      console.log("Created " + folderName + " folder.");
    }
    copyFolder(subFolder, targetFolder);
  }
  
  // Mark this folder as fully copied for the next 6 hours (21600 seconds).
  _cache.put(source.getId(), "", 21600);
}

function getFileByNameIn(name, parentFolder) {
  var files = parentFolder.getFilesByName(name);
  
  if(files.hasNext()) return files.next();
  else return null;
}

function getFolderByNameIn(name, parentFolder) {
  var subfolders = parentFolder.getFoldersByName(name);
  
  if(subfolders.hasNext()) return subfolders.next();
  else return null;
}
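
Since the quotas will interrupt long runs, one way to avoid babysitting the script is a time-driven trigger that re-runs start() automatically. This helper isn’t part of the original script – it’s just a sketch using the standard ScriptApp trigger API:

// Hypothetical helper: installs an hourly time-driven trigger so that start()
// re-runs automatically until everything has been copied. Remember to delete
// the trigger from the project's triggers page once the migration completes.
function installHourlyTrigger() {
  ScriptApp.newTrigger("start")
    .timeBased()
    .everyHours(1)
    .create();
}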

Using ‘git sync’ to automate common Git commands


At Brightrock Games about 5 of us actively use Git all day, committing and pulling work hourly on the same branch. This means that every time you want to pull or push, there are several manual steps to take: stash any outstanding changes you haven’t yet got around to committing (or any unnecessary trash Unity makes), pull with rebase if there is something upstream, then finally push. It’s an annoyance that wastes a few minutes every few hours.

Git best practice suggests that each developer works in their own branch until they’re ready to merge, but in practice, when the whole team is rapidly pumping out bug fixes and improvements, it’s just easier to work together on one branch.

To improve our workflow I have created a git alias called git sync that performs git stash -> git pull --rebase -> git push -> git stash pop.

On a command line, paste in the following to create a global git alias:

git config --global alias.sync '!f() { bold=$(tput bold); normal=$(tput sgr0); changes=false; if [[ `git status --porcelain` ]]; then changes=true; echo \"${bold}Changes detected, stashing...${normal}\"; git stash save --include-untracked; else echo \"${bold}No local changes, skipping stash${normal}\"; fi; echo \"${bold}Rebasing and pushing...${normal}\"; git pull --rebase && git push; if [ \"$changes\" = true ] ; then echo \"${bold}Unstashing changes...${normal}\"; git stash pop --quiet; fi; }; f'
That’s it.
For reference, here is the non-minified script:
bold=$(tput bold);
normal=$(tput sgr0);
changes=false;

if [[ `git status --porcelain` ]]; then
  changes=true;
  echo "${bold}Changes detected, stashing...${normal}";
  git stash save --include-untracked;
else
  echo "${bold}No local changes, skipping stash${normal}";
fi;

echo "${bold}Rebasing and pushing...${normal}";
git pull --rebase && git push;

if [ "$changes" = true ] ; then
  echo "${bold}Unstashing changes...${normal}";
  git stash pop --quiet;
fi;
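
With the alias in place, the whole stash/rebase/push/unstash dance becomes a single command. Going by the echo statements above, a run with local changes looks roughly like this (interleaved with Git’s own output):

$ git sync
Changes detected, stashing...
Rebasing and pushing...
Unstashing changes...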

 

Migrating your project to Git LFS

Filed under Game Development, Unity3D.

For the past 4 years we have been using Git as the repository for our game War for the Overworld. It comprises 18,000+ commits totaling a massive 35GB repo. It’s huge. So huge that our host, BitBucket, falls apart if one tries to clone the repo anew. I don’t blame BitBucket for this – at some point after we joined them they added a 1GB repo size limit and have been gracious enough not to apply that restriction to us. Ultimately the problem lies with Git itself – it wasn’t built to handle all the binary data we have thrown into it.

For the past few months I’ve been looking into migrating our giant repo to Git LFS and pushing it up to GitHub, our new home. This mission has been a lot more painful than I had imagined, so I am recording the steps I took to make it happen.

Mirroring the repository

The first thing we need to do is mirror the remote repository so we can perform the LFS migration before pushing it back up to our new remote (GitHub). Doing this is quite simple:

git clone --mirror [email protected]:subtgames/war-for-the-overworld.git

4 hours later…we move on.

Migrating to LFS

Okay. Now for the serious business, and to explain a few things. What I’m doing here is a deep migration to LFS: I am walking through every single commit in the entire Git history, replacing every binary file I’ve identified with its LFS representation, and uploading each file to our remote host. This kind of solution works best if you’re also going to be pushing the resulting repo to a new location (even on the same host). If you simply want to rewrite a few files into LFS for a pre-existing hosted repo then you may be better off using the official Git LFS Migrate tool and tutorial.
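
For comparison, that official route is a history rewrite run inside an existing clone, roughly like the sketch below (the file patterns here are placeholders – check the git-lfs documentation for the exact flags in your version):

git lfs migrate import --everything --include="*.psd,*.png,*.fbx"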

In this case we need to make use of some fairly dark magic made easy thanks to this tool. Once downloaded, it is run like so:

java -jar git-lfs-migrate.jar `
-s war-for-the-overworld.git `
-d converted/war-for-the-overworld.git `
-g https://USERNAME:[email protected]/BrightrockGames/war-for-the-overworld.git `
--write-threads 16 `
"*.bmp" `
"*.FBX" `
"*.TTF" `
"*.PNG" `
"*.cubemap" `
"*.dylib" `
"*.exe" `
"*.eps" `
"*.exr" `
"*.gif" `
"*.jpeg" `
"*.jpg" `
"*.png" `
"*.psd" `
"*.svg" `
"*.tga" `
"*.tif" `
"*.mp3" `
"*.ogg" `
"*.wav" `
"*.aiff" `
"*.avi" `
"*.flv" `
"*.mov" `
"*.mp4" `
"*.mpg" `
"*.wmv" `
"*.ogv" `
"*.fbx" `
"*.obj" `
"*.rar" `
"*.zip" `
"*.bin" `
"*.pdb" `
"*.dds" `
"*.dll" `
"*.exr" `
"*.lxo" `
"*.otf" `
"*.pdf" `
"*.rns" `
"*.tga" `
"*.ttf"

-s is the folder of the mirror we downloaded in the previous step.

-d is a new folder where the new migrated mirror will go.

-g is the new remote host location. This is necessary because this tool begins uploading the new LFS files immediately during this process.

Couple of things to note:

  • To Unity devs – I left out the *.unity file type, as we changed scenes very often and each change creates a new binary file, which made for an overly large LFS store. Unity scene files are also diffable by Git, so despite their large file size I’ve found they are fine left in plain Git.
  • This tool isn’t perfect – if your repo is big enough and the process is long enough, the remote host (GitHub) will eventually kick you off your session and the tool will crash. In this case you need to delete the converted/war-for-the-overworld.git/objects/ folder and restart the migration process (see the sketch below). Luckily the LFS files you’ve already uploaded will be skipped, so eventually it will complete! It took me days to do our repo.

This will kick off the very long process of walking Git history, rewriting files and commits, and uploading the new LFS files to the remote LFS store. Note that this does NOT migrate the repo itself to the new remote location though…
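
For reference, the crash-recovery loop from the second note above boils down to:

# The partially converted local mirror is now in a bad state; throw away its objects...
rm -rf converted/war-for-the-overworld.git/objects/
# ...then re-run the exact same java -jar git-lfs-migrate.jar command as before.
# Already-uploaded LFS files are skipped, so each run gets further than the last.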

Uploading to the remote host

The final part – Pushing to the remote host! This should be the easiest part, but it wasn’t for us…

First things first, let’s clean up the mirror before we push it:

cd converted/war-for-the-overworld.git/
git gc

Running git gc will cause Git to remove any loose files and garbage, as well as compress all the file blobs. This might take a while…

Once done, you simply run the below command to upload it to the remote host:

git push --mirror https://USERNAME:[email protected]/BrightrockGames/war-for-the-overworld.git

If your repo is <1GB in size, this should complete without much of an issue and you can walk away from this article.

If you’re in my position, where the resulting repo is still fairly large (ours is 2.5GB), then you’re going to need to enlist the help of GitHub support to temporarily relax 2 key limits:

  • GitHub has a 100MB per-file size limit – we hit that with some of our old Unity scene files, and I didn’t want to put them into LFS: they’re diffable plain-text files, and doing so blew up the LFS store by a factor of 5.
  • GitHub has a 1GB single-push limit – since our repo is well over 1GB and Git does not provide a way to push a mirror in multiple parts, we could not push our repo without being kicked off by GitHub once we passed 1GB uploaded.

Thankfully GitHub support have been accommodating and relaxed these limits for the purposes of getting the repo up there.

Extras

  • You can update an existing local mirror with git remote update.

Creating a cutout shader for doors and windows

Filed under Programming, Unity3D.

The game I’m currently working on is a 3D isometric management game where you build rooms on a space station. The rooms and station are made up of a series of tilesets, so the player can effectively paint out rooms of any size and shape they wish, with the wall tiles snapped together and themed appropriately based on the room type. All relatively easy stuff at this point. However we also allow the player to place doors and windows, and these objects may be multiple tiles wide. The first solution that came to us was to see whether we could complexify the tileset system a little to support sub-tilesets made up of chopped-up doors and windows. After some quick math this didn’t look good – the tileset requirements and combinations quickly exploded into really big numbers. This could possibly have been mitigated by cutting up the tiles even further and adding an automated tileset-builder step to construct the required combinations. But that sounded no good. There must be another way!

[Screenshot: 2016-07-29]


Improving the FBX workflow between 3ds Max and Unity3D


As some of our artists swear by 3ds Max, we must support it and ensure the workflow between it and Unity3D is as seamless as possible. Unfortunately, even though 3ds Max is among the top few 3D modelling applications, it still has some serious workflow issues. One of the most common tasks – exporting 3D models to the FBX file format – is fraught with problems:

  • 3ds Max uses a Z-up axis whilst Unity3D is Y-up. The FBX exporter doesn’t handle this conversion properly, so Unity3D detects the mismatch and adds a -90 rotation to the imported object’s transform.
  • Before exporting each object, the artist must zero out its position so it doesn’t import offset from [0,0,0].

Having to do this for many objects can be extremely time-consuming for the artist. We need something that can automate this! And batch it!

Thanks to a lot of work by Jos Balcaen I have been able to modify and improve his batch exporter to support a few Unity oddities.

Download the MaxScript here.

[Screenshot: 2016-07-14]

Why is Physics.UpdateBodies using up so much time? 20ms to 2ms

Filed under Game Development, Programming, Unity3D.

As part of some initial prototyping and load testing for our next game, we had to determine how to implement the physics system. Our game requires a dynamic world and the ability to simulate 100-500 agents. That kind of target is right on the edge of needing a bespoke system, so as to take every advantage one can to meet a 60 FPS minimum. We’re long-time Unity3D devs and were quite wary of its implementation of PhysX, but we had to try it anyway – the tooling and maintenance bonuses of using it would be of great value, let alone not having to write our own implementation.

Our test consisted of a single large floor collider and the “Unity Guy – Ethan” rigged model and animation assets (Rigidbody + Capsule Collider components), with a simple script that has him run around a couple of rally points. No pathfinding. We would spawn 500 of these agents and see what the frame times look like with them all within the camera frustum and, more importantly here, when they’re outside of the camera frustum.

There were a whole bunch of small optimizations we tweaked and played with, however there was one striking issue that really stumped us:

[Profiler screenshot: Physics.UpdateBodies taking ~20ms]

What the fuck? 20ms?! Wow Unity’s physics system is SO. SHIT!

Actually. It’s not. It took the better part of a day and a Unity Dev to explain in way too few words what this magical Physics.UpdateBodies method was:

“it’s sending all the transform update messages.”

Huh. Okay. So after digging around for a bit it turns out the problem is this: for whatever reason, this method sends back the transform updates of every transform under a Rigidbody. The model we were testing with has, as with all imported rigged models, its default skeleton structure laid out in 20-30 empty GameObjects. It’s a well-known optimization to trim these as much as possible, and it’s something we would have done when we got around to really implementing animated characters. However it seems quite insane for it to impact physics.

In either case the solution is simple – remove the dead transforms (you can read about optimizing rigged models here) and…

[Profiler screenshot: Physics.UpdateBodies down to ~2ms]

20ms to 2ms. Crazy. Simulating 500 or even 1000 active agents is actually possible right now. I wonder how many Unity developers dump the physics system without realizing this optimization.

Important Tips on Hiring a Video Games Writer

Filed under Game Development.

I’m Scott Richmond, the Producer and a programmer at Brightrock Games. I have been lucky enough to go through the experience of Kickstarting and, as of April 2015, successfully releasing our game War for the Overworld. WFTO had a considerable amount of writing in it, as well as voice acting, and in this article I will draw on that experience to discuss how we aim to tackle the writing department for our new game title. Before I start, I’d like to thank Chris Avellone for taking the time to discuss with me the inner world of video game writers.


How to Improve the Performance of Unity3D Animations


In our game War for the Overworld we make pretty heavy use of animations, with users potentially seeing 25-100 units on screen at once. This means the Animation Renderer takes up a considerable part of our frame time, so any tuning there can create a noticeable performance improvement. Recently I decided to take another look and found what I think is a lesser-known Unity feature – Optimize Transform Hierarchy. Using this feature reduced our Animation Renderer overhead by 50%!

[Screenshot: the Optimize Transform Hierarchy option]

Executing this feature on an animated prefab will cause it to remove all those empty bone Game Objects.

Why might you want to do this? Because it costs precious time for Unity to move those Game Objects along with the actual animations. It does not affect the animations themselves. But do note that this means you cannot attach arbitrary child objects to your model – for example, a weapon to its hand – and have them correctly follow the animation. Also be aware that it will delete any Game Objects you might have placed somewhere in the bone hierarchy. This can be solved with a bit of work, however.

Option 1 – Set everything up manually again

Often you’ll want at least some of the bone Game Objects available for use, such as the hands and head. You can manually dictate which bones Unity will create a Game Object for in the Rig tab of the model importer.

[Screenshot: the Rig tab in the model importer]

Note that this won’t automatically update the model prefab if you’ve already made one, so you’ll need to recreate the prefab with the new mesh.

Option 2 – Use a script to automate it!

We have some 70 units in our game, so manually updating all the prefabs was going to be a bad idea. I created a Unity Editor script that takes a set of prefabs to be processed. It searches each prefab for a Skinned Mesh, flags any bones that contain a custom Game Object, adds them as bones that need to stay, then finally optimizes the mesh. It worked on our messy unit prefabs, so hopefully it’ll be fine for you too; however it might not cover all edge cases, so make sure you double-check everything afterwards.

Tuning Git for large binary repositories


Git isn’t particularly well suited to video game repositories where you want to version binary files such as art assets, and at some point it will start to crack at the seams. Currently there is no 64-bit Git build for Windows, and at some point your repository will begin to crash with out-of-memory exceptions when performing heavy tasks such as git gc.

Until a 64-bit build is released for Windows, one can make the following changes to ensure Git doesn’t hit the 32-bit memory limit.

In .git/config add the following:

[core]
  packedGitLimit = 512m
  packedGitWindowSize = 512m
  bigFileThreshold = 256m
[pack] 
  deltaCacheSize = 256m
  windowMemory = 256m
  threads = 4
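
The same settings can also be applied from the command line, which writes them into .git/config for you (equivalent to the block above):

git config core.packedGitLimit 512m
git config core.packedGitWindowSize 512m
git config core.bigFileThreshold 256m
git config pack.deltaCacheSize 256m
git config pack.windowMemory 256m
git config pack.threads 4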

In .git/info/gitattributes we mark a number of file extensions as binary and disable delta compression for them:

*.jpg binary -delta
*.ogg binary -delta
*.png binary -delta