Moving house

Check it out: http://wspr.io

So smooth and shiny.


Surf-plotting scattered data in Matlab (or: Delaunay interpolation without a grid)

Matlab has a number of methods for interpolating data, both for data that is sampled on a regular grid and for data that is “scattered”, or randomly distributed.

Surface plotting

Plotting surfaces over grid points is easy using Matlab’s surf command, and interpolation of that data to get smoother plots is straightforward. For example,

[x,y,z] = peaks(10);
surf(x,y,z);

will plot:

[Figure: 3D surf plot of the peaks(10) data]

Generally I recommend avoiding 3D plots, so in 2D (view(2)):

[Figure: the same peaks data viewed from above with view(2)]

The variables x and y are 10×10 matrices defined by (the equivalent of) [x,y]=meshgrid(linspace(-3,3,10)), and z is the value at each point in (x,y) space.

If you have sampled data with non-uniform spacing, however, it’s not as obvious how to plot it. In this case your data would be organised with x, y, and z as column vectors, one measurement per row. The surf command simply can’t handle data in this form, since it isn’t organised in a way that indicates which points are adjacent.

Matlab provides commands for analysing this data using Delaunay Triangulation, for which it also provides an analogue to surf. Here’s an example:

N = 15;
rng(13); % repeatable random values
x = randn(1,N); % X & Y coords
y = randn(1,N);
z = randn(1,N); % "value" at each coordinate

After a little digging, you’ll find that the easiest way to plot this data as a surface is:

trisurf(delaunay(x,y),x,y,z)

Seems slightly redundant, but easy enough.

Surf plots with coarse grids

The problem is that surface plots are poor at visualising data over coarse meshes, since it is the patch corners that define the values. This can be seen in the following example, using peaks to plot a surface:

[Figure: peaks plotted three ways: plain surf (row 1), interpolated shading (row 2), and interpolation onto a finer grid (row 3)]

The first row shows plain surf. It can be seen that the colours of each “patch” don’t do a good job of representing their height (or “value”). In the second row, Matlab’s interpolating shader (shading interp) is used to “improve” this. It’s an improvement, but distortions are obvious where there are large changes in both directions. (Colours are smeared along diagonals, basically.)

Finally, in the third row an interpolation function is used to generate more data points. This is still quite a coarse interpolation, but it couldn’t go much finer without the edge lines taking over the 3D plot. The result looks far more convincing. Gridded interpolation can be done in a number of ways; for this example I used

F = griddedInterpolant(x',y',z'); % transpose converts from meshgrid to ndgrid format
[x2,y2] = ndgrid(linspace(-3,3,20)); % a finer grid over the same domain
surf(x2,y2,F(x2,y2))

(The transposes are necessary because peaks returns coordinates in meshgrid format, while griddedInterpolant expects ndgrid format. Good old Matlab.)

Surf plots with coarse scattered data

But what about scattered data? While Matlab does provide scatteredInterpolant, this form is only really convenient if you want to interpolate onto a regular grid. What if you just want to improve the look of your overly-coarse surface plot? Here’s the example using trisurf described above, with and without interpolated shading (shading interp):

[Figure: trisurf plots of the scattered data, with and without interpolated shading]

Note that without the interpolated shading, the plot is basically unusable. But even with interpolated shading, there’s significant distortion along certain diagonals. I’ve left out the edge lines in the bottom-right graph to emphasise this. If we resampled this function over a regular grid using scatteredInterpolant, we’d end up with points outside the original region, which might not always be appropriate. We could then delete the outsiders, but that would require additional processing. Wouldn’t it be nice to simply resample within the original region?
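For reference, the regular-grid approach would look something like this minimal sketch (the grid resolution is arbitrary); the grid corners that fall outside the convex hull of the data get extrapolated, which is exactly the problem described above:

F = scatteredInterpolant(x',y',z'); % expects column vectors
[xg,yg] = ndgrid(linspace(min(x),max(x),20), linspace(min(y),max(y),20));
surf(xg,yg,F(xg,yg)) % grid points outside the data's convex hull are extrapolated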

Delaunay interpolation

To do this requires a couple of steps. First, extract the Delaunay triangulation of our points:

DT = delaunay(x,y);

This function returns an M×3 matrix of indices into x and y, where M is the number of triangles. So if we extract the coordinates of each triangle’s three vertices and average them, we obtain the centroid of each triangle, along with an interpolated value there:

x2 = mean(x(DT),2); % centroid x-coordinate of each triangle
y2 = mean(y(DT),2);
z2 = mean(z(DT),2); % averaged "value" at each centroid

This figure shows the grid that results after interpolating once and re-applying the Delaunay triangulation (black before, red after):

[Figure: Delaunay grid before (black) and after (red) one interpolation pass]

Note that the Delaunay algorithm doesn’t simply subdivide the original triangles into smaller components. We can then apply this procedure iteratively to interpolate to an arbitrary level of detail (a code sketch follows the figure below):

[Figure: the triangulation after successive interpolation passes]
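In code, each interpolation pass appends the triangle centroids to the existing points and re-triangulates. A minimal sketch of the iteration (the details inside delaunay_surf may differ):

for n = 1:4 % each pass roughly triples the number of points
    DT = delaunay(x,y);
    x = [x, mean(x(DT),2)']; % append the centroids...
    y = [y, mean(y(DT),2)'];
    z = [z, mean(z(DT),2)']; % ...with their interpolated values
end
trisurf(delaunay(x,y),x,y,z)
shading interp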

And after removing the grid lines and increasing the order, the results look pretty nice:

[Figure: the interpolated surface with grid lines removed and more interpolation passes]

Interpolation will never solve the problem of under-sampling if there’s information missing, but it definitely helps to draw more pleasing figures.

I’ve uploaded this code to the Matlab File Exchange under the name delaunay_surf, and added it to my generic Matlab repository on Github: https://github.com/wspr/matlabpkg. Its usage is pretty basic:

p = delaunay_surf(x,y,z,'N',4);

with a few additional options. Check out the example file for more, er, examples.

Happy plotting!


Git support in Matlab

I have a love/hate relationship with Matlab, the details of which I probably obliquely mention here from time to time. The biggest reasons that I stick with Matlab are (a) I have to teach my students to use it, and (b) for all its warts, its IDE is quite nice for code authoring/debugging, exploring data sets, and even drawing somewhat sophisticated 3D diagrams.

I’ve just installed Matlab 2015a on my laptop, and am pretty happy to see that Matlab now supports Git. (This appears to have been added in 2014b, so I’m a little behind the times.)

[Screenshot: the Matlab file browser showing Git status icons]

I usually use a lightweight GUI to perform simple Git operations (GitX on Mac) and the command line for anything more sophisticated. In Matlab, most common Git operations can be performed from a clunky interface that involves right-clicking on a file, selecting “Source Control” from the menu, and then selecting from a list. Rather like TortoiseGit on Windows, I guess. There are a few impedance mismatches, such as the term “Compare to Ancestor”, which is Matlab’s version of “git diff” for seeing what you’ve changed in the current file. I think trying to abstract multiple version control systems under a consistent set of Matlab commands is probably a recipe for disaster in the long term, but I can understand the motivation.

I like that the Git status is indicated in the file listing, but I suspect I’ll rarely, if ever, use the Matlab interface to actually perform source control. I’m a little disappointed that there isn’t a better user interface than delving into contextual menus and receiving little more than plain info in pop-up windows. For example, what I’d like is to double-click a “changed” icon (the blue square in the screenshot above) and see the current differences from the last commit, with an integrated text field and a nice big button to commit the changes. Not as easy, but more likely to keep people within Matlab.

What I’m really hoping is that Git support in Matlab makes it possible to slowly introduce my undergraduate students to Git (engineers learning source control is a subject I hope to discuss another time). Perhaps more importantly, I hope it will help facilitate collaboration with the research students who work on my code. Fingers crossed.


Elliptic integrals in Matlab: symbolic toolbox is slow

My real work in magnetics involves evaluating sometimes-complex integrals whose solutions often include the elliptic integrals. These are a funny set of functions that I’ve discussed before.

Matlab has historically included only the bare minimum here: the in-built ellipke function for calculating the complete elliptic integrals of the first and second kind. Hence Igor Moiseev’s valuable work in this area (now moved to Github).

Those with a variety of Matlab toolboxes, however, will usually run into the newish functions provided by the symbolic toolbox ellipticF, ellipticE, and friends. (Quite an impressive collection, actually.) However, if you actually wish to use these functions for something computationally intensive, such as numerical integration, it’s a bad idea to use them. Why?


>> tic; ellipticF(0.5,0.5), toc
ans =
    0.5105
Elapsed time is 0.020991 seconds.

That’s a small fraction of a second — seems fast! Compare with Igor’s native Matlab function:


>> tic; elliptic12(0.5,0.5), toc
ans =
    0.5105
Elapsed time is 0.000705 seconds.
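A single tic/toc measurement is noisy, mind you; for a fairer comparison, timeit (in newer Matlab versions) averages over repeated runs. A quick sketch:

f_sym = @() ellipticF(0.5,0.5); % symbolic toolbox version
f_nat = @() elliptic12(0.5,0.5); % Igor's native Matlab version
timeit(f_sym)/timeit(f_nat) % speed ratio; around 30x from the timings above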

So the symbolic toolbox is fine if you just need to evaluate a value or two. But for anything iterative — and this holds for algorithms in general, not just for my own example of numerically integrating elliptic integral functions — you’ll want to avoid the overhead of switching to the symbolic toolbox mid-calculation.

(This is an example of why I find Mathematica a more enjoyable computational environment; no alien toolboxes.)


Avoid Excel vlookup: use index+match instead

This is, I’m sure, old news to the Excel gurus out there (one of whom, reluctantly, I seem to be turning into).

I started having a little more appreciation for Excel when I began using vlookup more seriously; it allows a semi-structured way of dealing with linked data in Excel. E.g., when a list of grades contains a different set of students than the class list, you can still sensibly look up values without manually copy/pasting information from one sheet to the other.

I suppose an actual example is in order:

[Screenshot: a basic vlookup example]

I hope this is fairly obvious: we have extremely basic lookup functionality (hence the name), where the function looks up “c” in the first column of the range and outputs the corresponding entry from the second column. You’d obviously usually use a reference to another cell rather than hard-coding the “c” in there.
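In case the screenshot doesn’t come through, the formula was along these lines (the cell layout here is hypothetical, with the lookup table in A1:B5):

=VLOOKUP("c", A1:B5, 2, FALSE)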

So this is great, but comes with some problems.

First of all, vlookup has the incredibly frustrating default behaviour of not requiring an exact match; in my own experience, things never work out unless you write it in the form vlookup(•,•,•,FALSE) — and if I’m rusty I can never remember whether that last argument should be true or false. So that’s one reason vlookup is dumb.

Second of all, vlookup breaks completely silently if you ever alter the columns of your original table. Suppose I re-arrange the spreadsheet so that there is a grade column as well:

[Screenshot: the same spreadsheet after inserting a grade column; the vlookup now returns the wrong data]

Unlike in many other Excel situations, inserting the new column hasn’t updated the column index in the vlookup formula (not that I’m saying it should), and our lookup now silently returns different data. This can bite you pretty badly unless you have stringent error checking somewhere along the line.

On the other hand, the useful replacement for vlookup does NOT have this problem. The solution is known as index+match, and you shouldn’t be intimidated out of using it just because it’s slightly more complex. In fact, I find it easier to use in most cases:

[Screenshot: the equivalent lookup written with index+match]

Translating that formula into prose produces something like this: “index into column D at the row that matches “c” in column C”. Note again Excel’s insistence on a dumb suffix to tell match to do its thing properly; don’t forget the zero in match(•,•,0).
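Written out, again with a hypothetical layout (data in columns C and D), it looks like this:

=INDEX(D:D, MATCH("c", C:C, 0))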

Why is index+match better? I’m glad you asked:

[Screenshot: after inserting the extra column, the index+match formula updates itself and still returns the right data]

As soon as we inserted the extra column, the match function automatically updated and our formula is still correct. You can argue other, more tenuous, advantages of index+match (e.g., that vlookup can’t have a negative column index, so it can only look rightwards), but automatic updating is the killer feature.


Expanding body of knowledge

[Image: “Expanding body of knowledge” diagram]

This image perfectly captures a notion that’s been floating around in my noggin for years now. Via FrenchKheldar on Stack Exchange.


Bones and organs, and everything else, in Matlab

I’ve recently become interested in human body modelling in biomechanics. One of the honours projects I’m supervising has the goal of producing a mock lumbar spine, which led us to a Thingiverse example of exactly that (ours will have additional features for attaching fake ligaments and discs, however).

That example linked to the BodyParts3D project, which consists of thousands of 3D models segmented by hand from MRI data and manually augmented where necessary using known anatomical information (documented in their paper).

The project consists of an online ‘anatomography’ program, which can be used to browse the database and produce viewable subsets for visualisation, as well as the database of body parts itself. The body parts are in the OBJ format, which I know little about but seems straightforward enough, and there are two resolutions available.

Using the low-resolution models, I’ve imported the bones of the model into Matlab using Dirk-Jan Kroon’s Wavefront OBJ toolbox; the results are shown below. I’ll tidy up the code and make it public as soon as I can; there are a few things on my plate before I can do that, however.
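The gist of the import is simple when a file contains only plain vertex and face lines. Here’s a minimal hand-rolled sketch (the toolbox handles the full OBJ spec, including the “v/vt/vn” face syntax that this ignores, and the filename is hypothetical):

fid = fopen('femur.obj'); % hypothetical single-bone OBJ file
V = []; F = [];
tline = fgetl(fid);
while ischar(tline)
    if strncmp(tline,'v ',2) % vertex line: "v x y z"
        V(end+1,:) = sscanf(tline(3:end),'%f',3)';
    elseif strncmp(tline,'f ',2) % face line: "f i j k" (plain indices only)
        F(end+1,:) = sscanf(tline(3:end),'%d',3)';
    end
    tline = fgetl(fid);
end
fclose(fid);
patch('Vertices',V,'Faces',F,'FaceColor',[0.9 0.9 0.8],'EdgeColor','none')
axis equal, camlight, lighting gouraud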

Long term, I’d like to make the skeleton move, and to use anthropometric data to annotate tendon attachments and insertions on the skeleton, to start producing an anatomically correct dynamic simulation of human motion. I know this is already being done by OpenSim, and it would be counterproductive to re-invent the wheel, but there is still scope for some orthogonal research in this area.

 

[Figure: the BodyParts3D skeleton imported and rendered in Matlab]


6DOF load cells using strain gauges

I’ve recently become interested, through an honours project, in the construction of six degree-of-freedom load cells using strain gauges. I believe much of this knowledge is unpublished, since load cell manufacturers have no desire or need to publish such information. There are some published designs, however, and they vary considerably in terms of mechanical design and strain gauge arrangement.

I’ve heard that high-end load cells use up to 32 strain gauges to resolve the six forces and torques. In such arrangements, four or more strain gauges are placed in groups carefully designed to reject off-axis loads. Six directions and four per group make 24, which is the number used by Joo et al. (2002), whose device consists of a steel rectangular shaft with multiple cut-outs to provide mobility for the various strain gauge groups.

A simpler mechanical system that also used 24 strain gauges was developed here at The University of Adelaide, described by Dr Carl Howard in his thesis ‘Active isolation of machinery vibration from flexible structures’.

Still, using 24 strain gauges to measure just six output signals could be considered overkill. Bayo and Stubbe (1989) investigated a truss arrangement, on the basis that truss elements undergo only one main mechanism of deformation, axial strain, and are therefore much easier to instrument with strain gauges. Their design uses only eight gauges, but the truss does not look easy to machine — except now with 3D printing this may no longer be an issue.

Finally, the more mathematically-minded rather than mechanically-minded may question why more than six strain gauges are needed at all. Surely, with a particular geometry and an understanding of the strain kinematics, six load signals could be calculated from six strain measurements. Indeed, this idea was investigated by Dai and Kerr (2000) using a Stewart platform (also known as a hexapod), in which the ‘legs’ of the platform were tension cables mounted to cantilevered strain gauge beams.

I’m not aware of any work that compares these various approaches for different requirements. The truss-like structures seem promising, but are they structurally rigid enough for dynamic loading situations? It’s intuitive that the devices with fewer strain gauges could suffer from more cross-axis contamination, but if the system is well characterised perhaps that can be accounted for. The savings in design complexity, in the cost of strain gauges, and in the cost of the associated electronics certainly make the comparison worth doing.
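On that last point, characterising a load cell is conceptually just linear algebra: apply a set of known loads, record the gauge signals, and fit a calibration matrix in the least-squares sense, which folds cross-axis sensitivity into the calibration itself. A minimal sketch with made-up numbers (the sizes and the assumption of a linear gauge response are mine, not taken from any of the papers above):

m = 50; % number of calibration load cases
n = 8; % number of strain gauges (cf. Bayo and Stubbe)
W = randn(6,m); % applied loads, columns of [Fx Fy Fz Mx My Mz]'
G = randn(n,6); % "true" gauge sensitivities (known only to the simulation)
S = G*W + 0.01*randn(n,m); % measured gauge signals, with a little noise
C = W/S; % least-squares calibration matrix, so that W is approx C*S
w = C*S(:,1); % recovered loads for the first case, cross-axis effects included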

Bayo and Stubbe (1989) “Six-axis force sensor evaluation and a new type of optimal frame truss design for robotic applications.” Journal of Robotic Systems 6. DOI: 10.1002/rob.4620060206

Dai and Kerr (2000) “A six-component contact force measurement device based on the Stewart platform.” Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 214. DOI: 10.1243/0954406001523696

Joo, Na and Kang (2002) “Design and evaluation of a six-component load cell.” Measurement 32. DOI: 10.1016/S0263-2241(02)00002-7


“Locomotion of Mexican jumping beans”

You know these people must absolutely love their research. Abstract:

The Mexican jumping bean, Laspeyresia saltitans, consists of a hollow seed housing a moth larva. Heating by the sun induces movements by the larva which appear as rolls, jumps and flips by the bean. In this combined experimental, numerical and robotic study, we investigate this unique means of rolling locomotion. Time-lapse videography is used to record bean trajectories across a series of terrain types, including one-dimensional channels and planar surfaces of varying inclination. We find that the shell encumbers the larva’s locomotion, decreasing its speed on flat surfaces by threefold. We also observe that the two-dimensional search algorithm of the bean resembles the run-and-tumble search of bacteria. We test this search algorithm using both an agent-based simulation and a wheeled Scribbler robot. The algorithm succeeds in propelling the robot away from regions of high temperature and may have application in biomimetic micro-scale navigation systems.


Forces between cylindrical magnets

Deriving an equation to calculate the force between cylindrical magnets is harder than for cuboid magnets (for which a closed-form solution was first published by Akoun and Yonnet in 1984).

Most recently, Ravaud et al. (2010) published a closed-form expression for the force between coaxial magnetic cylinders, for which I currently have a paper in publication with a much simpler equation. You can download my code implementing this from my repository of magnet code; the Matlab example is ‘examples/ravaud-cylmag.m’ and the Mathematica example has the same name with extension ‘.nb’.

The force equation for coaxial magnets contains elliptic integrals, and when analysing the system with an eccentric displacement in the radial direction the integral becomes, as far as I know, impossible to solve in closed form. I’m aware of two different approaches to calculating the forces in these cases. The first is by Nagaraj, in a somewhat elusive paper from 1988, who simplifies the necessary integrals down to a double integral that can be evaluated numerically. (Numerical evaluation of even a nicely-behaved integral is typically around an order of magnitude slower than evaluating a known closed-form solution.)
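As a rough illustration of that last claim, compare Matlab’s built-in evaluation of the complete elliptic integral of the first kind against direct quadrature of its defining integral (a sketch; integral assumes a reasonably recent Matlab, and quadgk would do otherwise):

m = 0.5;
tic; K1 = ellipke(m); toc % built-in evaluation
f = @(t) 1./sqrt(1 - m*sin(t).^2); % defining integrand of K(m)
tic; K2 = integral(f,0,pi/2); toc % direct numerical quadrature, noticeably slower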

A more recent analysis of eccentric magnetic cylinder forces was published in 2009 by Vokoun et al., and to be honest their approach is a little alien to me; I haven’t looked into their maths very much, and I don’t know whether their method would be harder to implement and/or more efficient than Nagaraj’s approach.

Implementations of these latter two papers would be very useful for us at the moment, but I don’t have time to write them myself right now; I have some honours students who might step up to the plate. If so, I’ll add their code to the magcode repository. A common theme with all this code is that a reference implementation saves people a great deal of time when they need to use the theory published by these authors.

The DOIs for the papers mentioned above are:

  • Akoun & Yonnet 1984: 10.1109/TMAG.1984.1063554
  • Ravaud 2010: 10.1109/TMAG.2010.2049026
  • Nagaraj 1988: 10.1080/10402008808981815
  • Vokoun 2009: 10.1016/j.jmmm.2009.07.030