Very simple remote logging.

I thought up a very simple way to get Nginx to handle remote event logging over http using Nginx’s custom log formats. You can format the logs as CSV and use them elsewhere.

Insert the following within the http block (outside the server block):

log_format custom '$time_local, $var1, $var2, $var3, $varN';

Add a location into the server block to handle the custom logging:

    location ~ /log/(?<var1>[^/?]+)/(?<var2>[^/?]+)/(?<var3>[^/?]+)/(?<varN>[^/?]+) {
        access_log /var/log/nginx/custom.log custom;
        add_header Content-Type text/plain;
        return 200 'ok';
    }

Restart Nginx.

Hit the URL from wherever, inserting your values in the path: https://example.com/log/value1/value2/value3/valueN

You should see the CSV entry in the custom log file.
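For reference, each request appends one CSV line in the shape of the log_format above. A quick bash sketch of that line (the timestamp layout mimics $time_local; the values are placeholders):

```shell
#!/bin/bash

# Build the CSV line that the 'custom' log_format above would produce.
# The timestamp format mimics nginx's $time_local; the values are placeholders.
make_log_line() {
  printf '%s, %s, %s, %s, %s\n' "$1" "$2" "$3" "$4" "$5"
}

make_log_line '02/Jan/2019:10:00:00 +0000' value1 value2 value3 valueN
# prints: 02/Jan/2019:10:00:00 +0000, value1, value2, value3, valueN
```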

Finally, download the log file and import it somewhere as a CSV.

I wrote a simple bash script to download the most recent CSV and display it in the default spreadsheet software (on a Mac):

#!/bin/bash

# grab the most recent log file
rsync [email protected]:/var/log/nginx/custom.log ./custom.csv

# open it (on a Mac)
open custom.csv

Convert Equirectangular HDRI to Face List

Here is a fast workflow to get an equirectangular HDR image into a format suitable for an HDR environment map in Three.js.

We will be using an open-source tool called cmftStudio (a cube map filtering tool). So head over there and download & install the correct binary for your platform.

Next grab yourself an equirectangular HDR image. There is a fantastic site dedicated to these over at https://hdrihaven.com/. I am using the Colosseum in this example.

Open cmftStudio. For this process, you’ll only need to focus on the right side toolbar. Under the environment tab, select the Edit button. Another pane, titled ‘Environment’, will slide in.

The first section in this pane is ‘Skybox’, under this select the ‘Browse…’ button, find your .hdr file then click load. You should now see this load as the environment map of the preview scene. You can right-click and drag to orbit around.

We need to choose a resolution for our faces, as HDR files can get a bit hefty. Under the Skybox section, click ‘Info’. This will slide in a third pane, titled ‘Skybox’. Look for the Resize section, choose a resolution that will be applied to each face and click ‘Resize’.

Now we need to export this as a face list, ready for use in an instance of CubeTextureLoader in Three.js.

Close the ‘Skybox’ pane and select ‘Save’ back in the second pane.

The save dialogue will now slide in. Select .hdr as your file type and FaceList as your output type. Browse to a directory where you would like the files saved, then click ‘Save’.

You now have your face list as an array of HDR files ready to import into Three.js.

Follow this example on the Three.js site to use these as an environment map in your scene.
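That example loads .hdr faces with HDRCubeTextureLoader from three's examples folder (browsers can't decode .hdr in a plain image tag, so the stock CubeTextureLoader won't handle them directly). A sketch, assuming the six faces were renamed px/nx/py/ny/pz/nz, the path is made up, and a `scene` already exists:

```javascript
import * as THREE from 'three';
// HDRCubeTextureLoader lives in three's examples folder, not the core build
import { HDRCubeTextureLoader } from 'three/examples/jsm/loaders/HDRCubeTextureLoader.js';

// Face file names are assumptions; match them to whatever cmftStudio wrote out.
const faces = ['px.hdr', 'nx.hdr', 'py.hdr', 'ny.hdr', 'pz.hdr', 'nz.hdr'];

new HDRCubeTextureLoader()
  .setPath('textures/colosseum/') // hypothetical directory
  .load(faces, (cubeMap) => {
    scene.background = cubeMap;   // show it as the skybox
    scene.environment = cubeMap;  // use it for PBR reflections
  });
```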

Firefox Chromeless Fullscreen

This is a solution to get Firefox (Quantum) working chromeless in fullscreen from startup. This would be useful for setting up a browser for a kiosk.

Create a userChrome.css file:

https://www.howtogeek.com/334716/how-to-customize-firefoxs-user-interface-with-userchrome.css/

Paste this into the CSS file:

#navigator-toolbox, #titlebar { display:none; }

The Browser Toolbox is useful to have for any other tweaks to the browser style… https://developer.mozilla.org/en-US/docs/Tools/Browser_Toolbox

Add this extension to Firefox:

https://addons.mozilla.org/en-GB/firefox/addon/autofullscreen/

Finally, launch the browser to a specific URL at startup (macOS) by creating a launch agent that runs a bash script executing:

#!/bin/bash
/Applications/Firefox.app/Contents/MacOS/firefox https://example.com

https://developer.apple.com/library/archive/documentation/MacOSX/Conceptual/BPSystemStartup/Chapters/CreatingLaunchdJobs.html
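Such a launch agent could look like the sketch below; the label, script path and filename (~/Library/LaunchAgents/com.example.kiosk.plist) are assumptions. Load it with `launchctl load ~/Library/LaunchAgents/com.example.kiosk.plist`:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.example.kiosk</string>
    <key>ProgramArguments</key>
    <array>
        <string>/Users/you/launch-kiosk.sh</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
</dict>
</plist>
```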

Multi-touch on MacOS in the Browser

I am faced, yet again, with making multi-touch work in the browser on macOS. This has always been a tricky one to solve. There have been working solutions in the past, but the ones I can find are broken in one way or another and have become dusty and forgotten, probably because most people decide to use Windows instead. So I have come up with a workaround.

This is straightforward on Windows without any convoluted setup, but on Mac multi-touch (not multi-touch gestures) is non-existent in the browser by default. Here is a solution using a Touch-Base driver, UPDD-TUIO, Tuio.js and some hacks that get the broken Tuio.js library working again. This all works as of 02/01/2019. It should probably work on Windows and Linux too, but this focuses on macOS.

First, head over to Touch-Base and follow their steps to work out which driver you need.

Once you have this, you may need to get in touch with Touch-Base directly to get hold of the newest driver packaged up with the UPDD-TUIO middleware. They may have added it as an option since this post was written. (The old version of UPDD-TUIO refused to launch on my machine, so I have no idea if it works.)

Once you have this up and running, calibrate your display and run the test to make sure it is setup correctly. Make sure UPDD-TUIO is set to start at login if required.

Edit an existing (or create a new) npm package and add these dependencies and resolutions to package.json:

  "dependencies": {
    "Tuio.js": "https://github.com/jamiemarkwhite/Tuio.js.git",
    "express": "3",
    "jspack": "^0.0.4",
    "socket.io": "0.9"
  },
  "resolutions": {
    "socket.io/policyfile": "0.0.6"
  }

Run the server at node_modules/Tuio.js/src/server.js (e.g. node node_modules/Tuio.js/src/server.js).

The resolutions entry is something I came up with to solve an incompatibility between Tuio.js, express and socket.io (note that resolutions is honoured by Yarn, not by classic npm).

When this is running, you can test a simple canvas example implementation. Run a web server at node_modules/Tuio.js and hit http://host:port/examples.html; hopefully you'll see some blue blobs on the screen representing your touch points.

I am currently working on converting the TUIO events received in the browser into TouchEvents. I'll post an update on that.

This is a very quick and sketchy post, for which I apologise. Improvements later maybe.

Smooth Shading in Blender / GLTF / THREE.JS MeshPhysicalMaterial

This is how you get smooth shading on your meshes from Blender -> GLTF -> THREE (with a MeshPhysicalMaterial)

 

1. Export your mesh with smooth shading toggled on in the object’s Tools -> Edit menu.

Smooth shading

2. Make sure material.flatShading = false is set on your material.

3. If you still have problems, see this https://github.com/mrdoob/three.js/issues/347
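If toggling flat shading off isn't enough, the usual culprit (per the issue above) is the normals. A hedged sketch of things to try on the loaded mesh (`mesh` is assumed to be your GLTF mesh; deleteAttribute is the recent-three name, older releases use removeAttribute):

```javascript
// Explicitly disable flat shading; the material must be flagged for recompile.
mesh.material.flatShading = false;
mesh.material.needsUpdate = true;

// If the exported geometry carries per-face normals, rebuild smooth
// per-vertex ones. (On non-indexed geometry you may need to merge
// vertices first, or the recomputed normals will still be flat.)
mesh.geometry.deleteAttribute('normal');
mesh.geometry.computeVertexNormals();
```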

Fix timezone settings for PHP on Mac OSX El Capitan for Symfony

Symfony requires you to specify a date.timezone in the ini file. The problem is, Mac uses its own install of PHP, and you need to create the directory structure it searches for the ini file in… plus a php.ini file itself. Here's how you do it.

1. Open terminal
2. nano /Library/Server/Web/Config/php/php.ini
3. Add the line date.timezone = "Europe/London" (Other timezones)
4. Save & exit.

The original error:

[Symfony\Component\Debug\Exception\ContextErrorException]
Warning: date_default_timezone_get(): It is not safe to rely on the system's timezone settings. You are *required* to use the date.timezone setting or the date_default_timezone_set() function. In case you used any of those methods and you are still getting this warning, you most likely misspelled the timezone identifier. We selected the timezone 'UTC' for now, but please set date.timezone to select your timezone.

MSAA (Anti-aliasing) on FBO’s in Cinder

Framebuffer objects in Cinder have no anti-aliasing by default, which leaves your edges jagged.
The solution is to create a Framebuffer format and specify the number of samples to use.

gl::Fbo::Format msaaFormat;
msaaFormat.setSamples( 4 ); // enable 4x MSAA
gl::Fbo myMsaaFbo( 640, 480, msaaFormat );

More info on FBO’s and anti-aliasing

OpenGL / Cinder – Custom Dynamic Attributes Example (GLSL 1.2)

Here’s an example of using dynamic custom attributes in GLSL 1.2 and Cinder.
I also found this link useful!

void makeVBO()
{
	gl::VboMesh::Layout layout;
	layout.setDynamicPositions();
	layout.addDynamicCustomFloat(); // add a custom dynamic float

	vboModel = gl::VboMesh( NUM_VBO_VERTICES, 0, layout, GL_POINTS ); // create vbo mesh with dynamic positions

	mShader.bind();
	GLuint loc = mShader.getAttribLocation( "myAttribute" ); // get location of "attribute float myAttribute" in vertex shader
	vboModel.setCustomDynamicLocation( 0, loc ); // set the local 'id' of the attribute
	mShader.unbind();

	int distFromCenter = VBO_RADIUS;

	// generate random vertices (start at 0 so every vertex gets a position)
	vector<Vec3f> vPositions;
	for ( int i = 0; i < NUM_VBO_VERTICES; ++i )
	{
		vPositions.push_back(
			Vec3f( -(distFromCenter / 2) + randFloat() * distFromCenter,
			       -(distFromCenter / 2) + randFloat() * distFromCenter,
			       -(distFromCenter / 2) + randFloat() * distFromCenter ) );
	}

	// iterate through vertices
	gl::VboMesh::VertexIter iter = vboModel.mapVertexBuffer();
	for ( int idx = 0; idx < NUM_VBO_VERTICES; ++idx )
	{
		// set position of vertex
		iter.setPosition( vPositions[idx] );

		// set the value of 'myAttribute' to a random number
		iter.setCustomFloat( 0, 10 + Rand::randFloat() * 90 );
		++iter;
	}
}

void draw()
{
        mShader.bind();
        gl::draw(vboModel);
        mShader.unbind();
}

Hundreds of Thousands of Particles at 60 fps

I've recently been getting to grips with Cinder, OpenGL and GLSL for a project, and I managed to get close to a million particles drawing at around 60 fps.

Here’s what I did to achieve this:

First, I created a VBO mesh containing randomly positioned vertices (each vertex represented a root particle position).

Using a VBO mesh means you don't have to upload the geometry before every draw call: it gets uploaded to the GPU once, then you can transform and draw the referenced mesh in your render loop.

Once I had the VBO mesh, I created vertex and fragment shaders that accept GL_POINTS as the input and output types. This means the vertex shader is expecting a bunch of vertices and the fragment shader is expecting to draw a single point to the screen for each one.

The clever thing is, you can flick a switch in OpenGL to enable point sprites, which means you can tell the fragment shader to draw a texture at each point instead.

You can also tell the vertex shader what size each point sprite should be rendered at, based on its distance from the camera.

Using point sprites is fast because there is no triangle geometry and no texture coordinates to map: the fragment shader just renders the sprite texture at a given screen position and at the size you have specified.
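The switch-flicking mentioned above looks roughly like this in compatibility-profile OpenGL. A sketch, not the project's exact code; spriteSize and spriteTex are assumed uniform names:

```cpp
#include "cinder/gl/gl.h"

// Call once before drawing the VBO as GL_POINTS.
void enablePointSprites()
{
    glEnable( GL_POINT_SPRITE );              // fragment shader receives gl_PointCoord
    glEnable( GL_VERTEX_PROGRAM_POINT_SIZE ); // honour gl_PointSize set in the vertex shader
}

// Vertex shader (GLSL 1.2): shrink sprites with distance from the camera.
//     gl_Position  = gl_ModelViewProjectionMatrix * gl_Vertex;
//     gl_PointSize = spriteSize / gl_Position.w;
//
// Fragment shader: draw the texture across the point's screen-space quad.
//     gl_FragColor = texture2D( spriteTex, gl_PointCoord );
```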

FAST FAST FAST!!!!!

Removing a file from git history

BE VERY CAREFUL!

This will go through ALL of your commits and remove all references to your big file you accidentally committed way back.

1. Go to your repository in terminal.
2. Paste this line in, but substitute bigfile.psd with whatever you want to remove.

git filter-branch --index-filter 'git rm -rf --cached --ignore-unmatch bigfile.psd' -- --all

Source: http://dangodesign.net/2014/02/remove-large-file-git/
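One gotcha: filter-branch leaves backup refs under refs/original/, so the repository won't actually shrink until those are dropped and garbage collection runs. A self-contained sketch in a throwaway repo (file name and commit messages are placeholders):

```shell
#!/bin/bash
# Demo in a throwaway repo: commit a file, purge it from history, reclaim the space.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo
echo big > bigfile.psd
git add bigfile.psd
git commit -qm 'add big file'
git rm -q bigfile.psd
git commit -qm 'remove big file (still in history!)'

# rewrite every commit, dropping the file (the command from the post)
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch \
  --index-filter 'git rm -rf --cached --ignore-unmatch bigfile.psd' -- --all

# drop the refs/original/* backups, expire the reflog, then garbage-collect
# so the old objects really go away
git for-each-ref --format='delete %(refname)' refs/original | git update-ref --stdin
git reflog expire --expire=now --all
git gc --prune=now

# no commit in history should touch the file any more
git log --all --oneline -- bigfile.psd
```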

AIR & Away3D 4.0, Dancing Monkeys, Retro Viruses and a Bubbling Flask

We were able to pull off silky smooth 3D animations and multi-marker AR with the help of Away3D 4.0 (which uses the Stage3D API in AIR for GPU-accelerated graphics), FLARToolkit, the dab hand of a 3D artist and some upbeat disco music. Not forgetting the days I spent working out a consistent workflow to get textured 3D models with skeleton animations from Maya to Away3D. Fun times!

Over The Air AdHoc Distribution with Xcode 4.6

At developer.apple.com

1. Create App ID (this may be wildcard, so you can use it for multiple apps with the same bundle prefix. e.g. com.wehaverhythm.* for all internal apps.)
2. Create an AdHoc provisioning profile using the above App ID. (More devices can be added later, but must be added before distribution export).

In Xcode
3. In Xcode’s organiser, refresh your provisioning profile. (Make sure you refresh each time the profile has been amended online, such as when you add more device IDs.)
4. Click Product -> Build For -> Archiving.
5. Click Product -> Archive.

[When archive is complete, it will appear in Xcode’s organiser under ‘Archives’.]

6. Select the application, and the archive of that application you want to distribute AdHoc.

7. Click ‘Distribute’.

8. Select ‘Save for Enterprise or Ad-Hoc Deployment’

9. Click ‘Next’.

10. Select the correct provisioning profile from the ‘Code Signing Identity’ drop down.

11. Click ‘Next’, again.

12. …wait for as long as it takes…

13. MAKE SURE YOU CLICK ‘Save for Enterprise Distribution’! Even for AdHoc. This drove me insane.

14. Enter the URL where the application will be hosted. The absolute URL, including the filename.

15. Give it a title. Fill in the image URLs if you need them.

16. Save the *.ipa somewhere you will find it.

[You will notice this will also save an accompanying *.plist file]

In a text editor
17. Create an html file like the one below:

<title>Install AdHoc Distribution</title>

<a href="itms-services://?action=download-manifest&amp;url=http://example.com/MyApplicationName.plist">Install My AdHoc App</a>

FTP
18. Upload your *.ipa and *.plist files to the URL specified in the export process, earlier. Upload your HTML file to a web server.

On your authenticated devices
19. In your web browser on one of the authenticated devices, go to the HTML page you created and tap the hyperlink to your app.

20. INSTALL IT!