Method for finding the centre of the cell and rad/tan orientations for Gen 2 OPMs
The schematic for the sensors can be found here. 3D models of the sensors have been positioned on the scalp of the person being scanned. This is done externally and we are given a 3D model (stl file) of the scalp and sensors. It looks like this:
The code in this guide is mostly from our GitHub (here). It uses properties of the 3D models to find the centre of the cell and the rad and tan orientations, as per the schematic.
Find the centre of the cell and orientations for a single sensor.
First, set up paths and config. This format is just a holdover from the function that is normally used here.
clear dataPath outputsPath templatesPath
cfg.folder = strcat(inputsPath,'\Scannercasts\NA\Individual STL');
cfg.outputfolder = strcat(inputsPath,'\Scannercasts\NA\');
cfg.scalp = strcat(inputsPath,'\Scannercasts\NA\NA Head Clean.stl');
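The snippet above assumes a couple of variables that are defined elsewhere in the full script: inputsPath and (further down) cfg.slotori. To run this walkthrough on its own, something like the following is needed; the path is a placeholder and the slot orientation depends on your scannercast:
inputsPath = 'C:\path\to\your\data'; % placeholder root folder, replace with your own
cfg.slotori = 'tan'; % 'rad' or 'tan'; used when assigning orientations below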
Find the STL files
STLdir = strcat(cfg.folder);
sensorListing = dir([STLdir '\*.stl']);
if isempty(sensorListing)
    sensorListing = dir([STLdir '/*.stl']);
end
For this demo, only look at the first sensor. Again, this is just a holdover from the function, where this code sits in a loop.
Read in the STL file for that sensor.
sensorIdx = 1; % just the first sensor for this demo (normally the loop index)
fprintf('Extracting information from .stl file %3d of %3d\n',...
    sensorIdx,size(sensorListing, 1));
Extracting information from .stl file 1 of 76
sensorFilename = fullfile(sensorListing(sensorIdx).folder,...
sensorListing(sensorIdx).name);
[faces, verts] = stlread(sensorFilename);
filename{sensorIdx} = sensorFilename;
Find the centre of the cell first.
Remove duplicate vertices from the STL file. This helps later on; for example, it avoids duplicate corners, which would break assumptions such as the cuboid having exactly eight corners.
[redVerts,~,~] = unique(verts,'rows');
% Face references need to be updated to match the reduced vertex list.
redFaces = zeros(size(faces));
for redVertIdx = 1:length(redVerts(:,1))
    vertRepeatIdx = find(ismember(verts,redVerts(redVertIdx,:),'rows'));
    faceRepeatIdx = find(ismember(faces,vertRepeatIdx));
    redFaces(faceRepeatIdx) = redVertIdx;
end
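As an aside, the same reduction can be done without the loop by using the third output of unique, which maps each original vertex row to its row in the reduced list; a minimal sketch of the equivalent (not how the repo does it):
% ic(i) is the row of redVerts that verts(i,:) was merged into, so
% indexing ic with the original face list remaps all faces in one step.
[redVerts,~,ic] = unique(verts,'rows');
redFaces = ic(faces);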
We can see that there is a cone on the top and bottom of each sensor. Together they mark the centre of the cell, as per the Gen 2 spec.
patch('Faces',redFaces,'Vertices',redVerts,'FaceAlpha',0,'EdgeAlpha',0.2);
title('Reduced vertices model.')
The cone vertices can be found by looking for the top two most referenced vertices within the faces of the model.
facesList = reshape(redFaces,[numel(redFaces),1]);
[n,bin] = hist(facesList,unique(facesList)); %#ok<HIST>
[~,idx] = sort(n,'descend'); % order vertices by how often the faces reference them
count = n(idx); % count instances
value = bin(idx); % corresponding values
Fortunately, the most referenced vertex is always the bottom cone tip and the second most referenced is always the top.
bottomPoint = [redVerts(value(1),1),redVerts(value(1),2),redVerts(value(1),3)];
topPoint = [redVerts(value(2),1),redVerts(value(2),2),redVerts(value(2),3)];
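For reference, the same two tip vertices can be found without the deprecated hist call; a small sketch using accumarray (this is not how the repo does it):
% refCounts(v) is the number of times vertex v appears in redFaces.
refCounts = accumarray(facesList,1);
[~,order] = sort(refCounts,'descend');
% order(1) and order(2) match value(1) and value(2) above: the bottom and
% top cone tip vertex indices into redVerts.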
Each cone has the same depth (0.2 mm, I believe), so the mean of the two tips gives the centre of the cell.
centrePoint(sensorIdx,1:3) = mean([bottomPoint; topPoint]);
Check that they have been labelled correctly.
patch('Faces',redFaces,'Vertices',redVerts,'FaceAlpha',0,'EdgeAlpha',0.2);
hold on
scatter3(bottomPoint(1),bottomPoint(2),bottomPoint(3));
scatter3(topPoint(1),topPoint(2),topPoint(3));
scatter3(centrePoint(1),centrePoint(2),centrePoint(3));
hold off
title('Sensor model with labeled cones and centre of cell')
Now find the orientations.
This is a bit more challenging because the model does not have a cone at the ends. Tan is easy (top to bottom cone), but rad is more difficult without a landmark.
Making things more difficult, the shape has internal vertices. We only want the outer boundary.
boundFaces = convhull(redVerts,'simplify',true);
Check it looks okay.
patch('Faces',boundFaces,'Vertices',redVerts,'FaceAlpha',0,'EdgeAlpha',0.2);
hold on
scatter3(redVerts(:,1),redVerts(:,2),redVerts(:,3));
hold off
You should see the internal vertices are still there, but unused by the boundary faces. Remove unused vertices.
uniqueBoundFaces = unique(boundFaces);
newBoundFaces = zeros(size(boundFaces));
boundVerts = zeros(length(uniqueBoundFaces),3);
for vertIdxIdx = 1:length(uniqueBoundFaces)
    boundVerts(vertIdxIdx,1:3) = redVerts(uniqueBoundFaces(vertIdxIdx),1:3);
    % Replace all references to the redVerts index with the new boundVerts index.
    [xIdx, yIdx] = find(boundFaces == uniqueBoundFaces(vertIdxIdx));
    for ii = 1:length(xIdx)
        newBoundFaces(xIdx(ii),yIdx(ii)) = vertIdxIdx;
    end
end
There is no simple way to identify and label the ends of the sensor in its current form. I found this great function (CuboidRANSAC) that can help. It fits a cuboid to a set of points which, given the boundary points, gives us a simple version of the model.
[simpleVerts, cuboidParameters, ~, ~] = CuboidRANSAC(boundVerts);
Iterations completed: 100
Iterations completed: 200
Iterations completed: 300
Iterations completed: 400
Iterations completed: 500
Iterations completed: 600
Iterations completed: 700
Iterations completed: 800
Iterations completed: 900
Iterations completed: 1000
Total Iterations: 1000
% The function makes an empty figure. Close that.
close(gcf);
simpleFaces = convhull(simpleVerts,'simplify',true);
Take the simple cuboid dimensions for later
edgeDists = round(cuboidParameters(4:6),4)';
Check that the fit is okay.
patch('Faces',simpleFaces,'Vertices',simpleVerts,'FaceAlpha',0,'EdgeAlpha',0.2);
hold on
scatter3(redVerts(:,1),redVerts(:,2),redVerts(:,3));
hold off
title('Simple cuboid with original vertices overlaid')
With this we can get the orientation along the long edge. Start by finding the two groups of corners that form the small end faces.
longestSide = max(edgeDists);
edgeDists(edgeDists == longestSide) = [];
Take a vertex and find the connected vertices.
firstGroupCornerVerts = zeros(length(simpleVerts(:,1))/2,length(simpleVerts(1,:)));
remCornerVerts = simpleVerts;
This loop traces around one of the small end faces, stepping from corner to corner using the pre-specified edge lengths.
for cornerIdx = 1:length(firstGroupCornerVerts(:,1))
    if cornerIdx == 1
        % Start from an arbitrary corner.
        firstGroupCornerVerts(cornerIdx,:) = remCornerVerts(cornerIdx,:);
        remCornerVerts(1,:) = [];
    else
        % Distances from the previous corner to the remaining corners.
        dist = zeros(length(remCornerVerts(:,1)),1);
        for remIdx = 1:length(remCornerVerts(:,1))
            dist(remIdx,1) = pdist([firstGroupCornerVerts(cornerIdx-1,:);remCornerVerts(remIdx,:)]);
        end
        % Long or short edge this time? Alternate between the two face edge lengths.
        nextDist = mod(cornerIdx,2) + 1;
        nextCornerIdx = (round(dist,4) == edgeDists(nextDist));
        firstGroupCornerVerts(cornerIdx,:) = remCornerVerts(nextCornerIdx,:);
        remCornerVerts(nextCornerIdx,:) = [];
    end
end
The remaining corner verts are the second group.
secondGroupCornerVerts = remCornerVerts;
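As an aside, the same two corner groups can be recovered without walking the edges: project the eight cuboid corners onto the cuboid's long axis and split them at the median projection. A minimal sketch (the variable names here are illustrative, not from the repo):
% Centre the fitted cuboid corners and estimate the long axis by SVD
% (the direction of largest spread across the eight corners).
centredCorners = simpleVerts - mean(simpleVerts,1);
[~,~,V] = svd(centredCorners,'econ');
proj = centredCorners*V(:,1);
% Corners on either side of the median projection form the two end faces.
groupA = simpleVerts(proj <  median(proj),:);
groupB = simpleVerts(proj >= median(proj),:);
Which of the two groups is the flat end would still be decided by the distance-to-cell-centre check below.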
Check they make sense.
scatter3(firstGroupCornerVerts(:,1),firstGroupCornerVerts(:,2),firstGroupCornerVerts(:,3));
hold on
scatter3(secondGroupCornerVerts(:,1),secondGroupCornerVerts(:,2),secondGroupCornerVerts(:,3));
patch('Faces',redFaces,'Vertices',redVerts,'FaceAlpha',0,'EdgeAlpha',0.2);
hold off
title('Small face corners marked on sensor model');
Take the mean position of each grouping.
firstGroupCentreVert = mean(firstGroupCornerVerts,1);
secondGroupCentreVert = mean(secondGroupCornerVerts,1);
The centre of the cell sits closer to the flat end of the sensor, so check which group centre is closer to the centre of the cell.
firstGroupCentreDist = pdist([firstGroupCentreVert;centrePoint(sensorIdx,1:3)]);
secondGroupCentreDist = pdist([secondGroupCentreVert;centrePoint(sensorIdx,1:3)]);
if (firstGroupCentreDist < secondGroupCentreDist)
    flatEndVerts = firstGroupCornerVerts;
    flatEndCentreVert = firstGroupCentreVert;
    cableEndVerts = secondGroupCornerVerts;
    cableEndCentreVert = secondGroupCentreVert;
elseif (firstGroupCentreDist > secondGroupCentreDist)
    flatEndVerts = secondGroupCornerVerts;
    flatEndCentreVert = secondGroupCentreVert;
    cableEndVerts = firstGroupCornerVerts;
    cableEndCentreVert = firstGroupCentreVert;
end
If the sensors are in 'tan' orientation slots, the tan axis of the sensor is aligned with the rad axis of the head. Need to account for that.
if strcmp(cfg.slotori,'rad')
    radOri(sensorIdx,1:3) = cableEndCentreVert - flatEndCentreVert;
    tanOri(sensorIdx,1:3) = topPoint - bottomPoint;
elseif strcmp(cfg.slotori,'tan')
    radOri(sensorIdx,1:3) = topPoint - bottomPoint;
    tanOri(sensorIdx,1:3) = cableEndCentreVert - flatEndCentreVert;
end
Normalise the orientations.
L = sqrt(radOri(sensorIdx,1)^2 + radOri(sensorIdx,2)^2 + radOri(sensorIdx,3)^2);
radOri(sensorIdx,1:3) = radOri(sensorIdx,1:3)/L;
L = sqrt(tanOri(sensorIdx,1)^2 + tanOri(sensorIdx,2)^2 + tanOri(sensorIdx,3)^2);
tanOri(sensorIdx,1:3) = tanOri(sensorIdx,1:3)/L;
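Optionally, since the cone axis and the long edge of the housing should be close to perpendicular, a quick check can flag a badly fitted sensor. A sketch (the tolerance is an arbitrary choice, not from the repo):
% Both orientations are unit vectors after normalisation, so acosd(dot(...))
% gives the angle between them in degrees.
angleDeg = acosd(dot(radOri(sensorIdx,:),tanOri(sensorIdx,:)));
if abs(angleDeg - 90) > 2 % arbitrary 2 degree tolerance for this sketch
    warning('Sensor %d: rad and tan are %.1f degrees apart, expected ~90.',sensorIdx,angleDeg);
end
% Their cross product gives the remaining (third) sensor axis if needed.
thirdOri = cross(radOri(sensorIdx,:),tanOri(sensorIdx,:));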
Look at the result
First, the centre and orientation of the sensor we have been working on.
% For illustration make the arrows bigger.
quiver3(centrePoint(:,1), centrePoint(:,2), centrePoint(:,3),...
    radOri(:,1).* 20, radOri(:,2).* 20, radOri(:,3).* 20);
hold on
quiver3(centrePoint(:,1), centrePoint(:,2), centrePoint(:,3),...
    tanOri(:,1).* 20, tanOri(:,2).* 20, tanOri(:,3).* 20);
scatter3(centrePoint(:,1), centrePoint(:,2), centrePoint(:,3));
patch('Faces',redFaces,'Vertices',redVerts,'FaceAlpha',0,'EdgeAlpha',0.2);
hold off
title('Single sensor with centre and orientations');
The main function, extractSensorPositions.m, runs this in a loop for all sensors with some additional plots.