
Feed aggregator


Automating the extraction of real data from an image of the data – part 2

Matlab Image processing blog - 2014, January 7 - 05:00

I'd like to welcome back my fellow MATLAB Central blogger Brett Shoelson for the second in a three-part series on extracting curve values from a plot. You can find Brett over at the File Exchange Pick of the Week blog, or you can check out his many File Exchange contributions. -Steve

Quick recap

Continuing from last week's guest post, we're heading down the path of extracting real data from the "efficiency curve" in a graphical representation of those data. To recap, here's the original image:

Here are the steps we need to repeat from part 1

url = 'http://blogs.mathworks.com/images/steve/2013/example-pump-perf-curve.jpg';
chart = imread(url);
bw = im2bw(imcomplement(rgb2gray(chart)));
filled = imfill(bw,'holes');
cc = bwconncomp(filled);
stats = regionprops(cc,'Area');
A = [stats.Area];
[~,biggest] = max(A);
filled(labelmatrix(cc)~=biggest) = 0;
bb = regionprops(filled,'BoundingBox'); %We'll use this later!
bb = bb.BoundingBox;
chart(~repmat(filled,[1 1 3])) = 0;
curveOfInterest = rgb2ycbcr(chart);
curveOfInterest = curveOfInterest(:,:,2);

By the end of the first post, I had managed to reduce the SNR so that I ended up with an image (curveOfInterest) that contains a single region...that partially isolates the two blue curves in the top axes:

Next up, segmentation

Many (perhaps most) image processing problems require some approach to "segmenting" the image--creating a binary mask that shows white ("true," or 1) where we have signal, and black ("false," or 0) elsewhere. Now that I've masked and converted the original image to YCbCr ("luminance/chrominance/chrominance"), and selected the second (Cb) plane of the resulting image, segmentation is relatively easy. I'm going to use SegmentTool to explore options for doing that:

Simple thresholding works well here (try it!), but I'm going to use a different approach--one that sometimes succeeds where thresholding fails.
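If you want to try the thresholding route, here's a rough sketch of what that experiment might look like. (This is only an illustration; the level comes from Otsu's method via graythresh rather than anything tuned for this image.)

% Simple global-threshold sketch (the threshold choice is an assumption):
Cb = im2double(curveOfInterest);   % the blue-chrominance plane from above
level = graythresh(Cb);            % Otsu's method picks a global level
bwSimple = im2bw(Cb, level);       % binary mask of the blue curves
imshow(bwSimple)
title('Simple Threshold (sketch)')

Now, back to the extended-maxima approach: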

curveOfInterest = imextendedmax(im2double(curveOfInterest),0.30,8);
imshow(curveOfInterest)
title('Binary Curves')

Morphologically clean the image

For subsequent analysis, and extraction of data, it will be useful to morphologically alter the image. MorphTool will be your best friend in that regard! ;)

After playing around a bit, I decided to dilate the image, and then to skeletonize it and remove spurs. (bwmorph makes both of those latter processes easy.):

curveOfInterest = imdilate(curveOfInterest, strel('Disk',6));
curveOfInterest = bwmorph(curveOfInterest, 'skeleton', Inf);
curveOfInterest = bwmorph(curveOfInterest, 'spur', 11);
imshow(curveOfInterest)

If the curves in that image don't look contiguous, recognize that that's just a screen-resolution issue. There's now exactly one region in this image:

cc = bwconncomp(curveOfInterest);
cc.NumObjects

ans =

     1

Extracting the curve of interest

And now for the difficult part

Okay, we're well on our way. We've eliminated the black and red curves that we don't need, and we've got the curve of interest reduced to a skeleton that contains the minimum information that we do need. But we still have to find a way to extract only the efficiency curve, and to discard the curve representing "334-mm" of head. We can't use color; the curves in the original image are identical. So we're left with differentiating the curves based on their morphological (shape) characteristics.

I'm going to do this two different ways

Sometimes it's useful to be able to trace the boundary of a region, or (as we do here), to trace along a particular path. bwtraceboundary will allow us to do just that.

BWTRACEBOUNDARY

If you've never used it before, take a moment to read the documentation for bwtraceboundary. You'll find that it allows you to operate on a binary image by specifying a starting point (P), an initial trajectory (fstep) along a path in the image, and a direction (dir) in which you want to trace the path. ("Direction" here allows you to specify the overall orientation of the boundary you want to trace.)

Automating the determination of the starting point

Recalling that the curve of interest always starts in the lower left of the upper axes, we can readily automate the determination of the starting point. (It will be the point closest to the bottom left of the masked axes.) I'm going to use regionprops to determine the bounding box of the masked axes (currently contained in variable "filled"). However, the doc for regionprops indicates that BoundingBox is measured from the upper left; I need the lower left so I'll have to manipulate the calculation of the bounding box a bit:

targetPoint = regionprops(filled,'BoundingBox');
targetPoint = targetPoint.BoundingBox(1:2) + [0 targetPoint.BoundingBox(4)];
[ys,xs] = find(curveOfInterest);

Now it remains to get the distance from all points to the lower left corner of the bounding box, and to find the point that minimizes that distance. (I'm going to use a Statistics Toolbox function here, but this is a relatively easy function to write if you don't have that Toolbox.)

[~,ind] = min(pdist2(targetPoint,[xs,ys]));
P = [ys(ind),xs(ind)]; %(Careful here: P is represented as [row,col], not [x,y]!)
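(If you don't have the Statistics Toolbox, here's a rough base-MATLAB sketch of the same nearest-point search. It reuses the variables above and minimizes the squared Euclidean distance, which has the same minimizer as the distance itself.)

% pdist2-free sketch: squared distance from every curve pixel to the
% lower-left corner of the bounding box.
d2 = (xs - targetPoint(1)).^2 + (ys - targetPoint(2)).^2;
[~,ind] = min(d2);
P = [ys(ind),xs(ind)]; % still [row,col], as bwtraceboundary expects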

Now we need to orient the trace

Consider the results using two different orientations:

N = 1000; %If I leave this unbounded (Inf), all points on the curve will be detected!

% Counterclockwise
Bccw = bwtraceboundary(curveOfInterest,P,'NE',8,N,'counterclockwise');
imshow(curveOfInterest);
hold on
plot(P(2),P(1),'go','MarkerSize',12)
plot(Bccw(:,2),Bccw(:,1),'r','LineWidth',5);
title('Counterclockwise')
hold off

Now clockwise.

Bcw = bwtraceboundary(curveOfInterest,P,'NE',8,N,'clockwise');
imshow(curveOfInterest);
hold on
plot(P(2),P(1),'go','MarkerSize',12)
plot(Bcw(:,2),Bcw(:,1),'r','LineWidth',5);
title('Clockwise')
hold off

That's interesting; I see that if I specify a clockwise orientation for my curve (and if I use a sufficiently large N), I can constrain the boundary-tracing algorithm to roughly follow the efficiency curve I'm after. However, there's a significant spur on the data that may present difficulties when we try to fit the data to a curve that reasonably represents efficiency versus flow rate. If I break this up into smaller pieces and examine the output at each stage, I find that I can significantly improve this result. Here, for instance, I "step" in an easterly direction, initially specifying a counterclockwise orientation. Then I iterate over small chunks, and evaluate the results after each iteration. When the curve stops increasing in the x-direction (i.e., when the largest column value isn't equal to the step size), I change the orientation to keep the trace following the desired curve. (Note that there are lots of combinations of parameters that will work here; this step took some experimentation!)

sz = 25;
ind = sz;
nIters = 100;
allBs = nan(sz*nIters,2);
Pnew = P; %reset
h = text(200,385,'Direction:','FontSize',14,...
    'FontWeight','bold','color','w','HorizontalAlignment','left')
for ii = 1:nIters
    if ind == sz
        direction = 'counterclockwise';
    else
        direction = 'clockwise';
    end
    set(h, 'string',['Direction: ',direction])
    %B yields [ROW,COL]:
    B = bwtraceboundary(curveOfInterest,Pnew,'E',8,sz,direction);
    hold on; plot(B(:,2),B(:,1),'r.'); drawnow;
    allBs((ii-1)*sz+1:(ii-1)*sz+sz,:) = B;
    [~,ind] = max(B(:,2));
    % Update the starting point, Pnew:
    Pnew = B(ind,:);
end

That result looks great, and will facilitate a good curve fit!

Using regionprops

Anyone who has ever heard me lecture about image processing has likely heard me mention that regionprops can make difficult problems easy. Let's first use bwmorph to find the intersection of the two curves; it will present as a branch point in the skeletonized curve.

branchPoints = bwmorph(curveOfInterest,'branchpoints');

Now dilate that point and use it to break the curves at their branches.

branchPoints = imdilate(branchPoints,strel('disk',3));
curveOfInterest = curveOfInterest & ~branchPoints;
imshow(curveOfInterest)
axis([100 420 180 400])

In this zoomed view, we can see that we've now broken the curve into branches. We can also verify that we now have multiple ROIs...

cc = bwconncomp(curveOfInterest)

cc =

    Connectivity: 8
       ImageSize: [738 1067]
      NumObjects: 11
    PixelIdxList: {1x11 cell}

...and we can use their shape properties to select the ones we want. (I will select the ones comprising more than 10 connected pixels and that have a positive orientation, i.e., angle from the horizontal.)

stats = regionprops(cc, 'Area','Orientation','Centroid');
idx = find([stats.Area] > 10 & [stats.Orientation] > 0);
curveOfInterest = ismember(labelmatrix(cc),idx);
clf
imshow(curveOfInterest)

figure
[ys,xs] = find(curveOfInterest);
plot(xs,ys,'r.','MarkerSize',7)
axis ij
axis equal

Next up: fitting a curve

So that essentially wraps up the "image processing aspect" of the problem, and from here, it becomes a problem of fitting a curve. For completeness, I'll hijack Steve's blog one more time to show how the Curve Fitting Toolbox facilitates that process. See you next week!

The complete series



Published with MATLAB® R2013b


Categories: Blogs

day ninety five

Coal Trail - 2014, January 6 - 13:26


At home, every day that should be getting easier for Ben or more "normal" seems to be getting more challenging.  We struggle every day (5x a day) with giving him his medications, getting him to drink 1.5L a day and eat.  He needs to bath frequently and that in itself is getting to be a 1 hour process - just getting him to agree to have a sponge bath.  I'm sure he hates not being able to have a regular shower or bath because his lines need to be kept dry.  We are visiting the hospital at least twice a week now, which means early wake up times and up to two hours in the truck one way - and it is all we can do to get Bennett in the truck and ready to go.  It was nice that over the holidays when we were going in 2-3 times a week that most of the commuters in Calgary were on holidays, and we made it to the hospital in 57 minutes one day!!!
I think for Bennett, most of the struggles come from the fact that he is feeling so well, yet he is still confined to home and very limited visitors.  I really don't think that Bennett is trying to make it as hard as it currently is (for his parents or for him!), I just think it's starting to wear on him.  He wants to be the one going to school with his sisters and his friends, he wants to be the one going to Fernie with his cousins, he wants to not worry about his line, his blood counts, taking his "motor oil" medicine and the pain and sores in his mouth.

Although it is a challenge with Bennett some days, we still continue to be grateful for how far we have come.  Bennett has been able to go to COP for one day of skiing on some incredible new gear.  He spent many days during the holidays skating on the amazing outdoor rink in Longview with his cousins.  He also really enjoyed having his sisters home (and to himself) for the two week Christmas break - he is missing them today though!


As we near our 100 days, we have several tests and procedures that are still needed.  On the 100th day, Bennett will have his chimerism test redone.  This is the test that tells us if the donor's marrow is the one producing all the blood in his body, or if Bennett's own marrow has overtaken the donor cells.
I know that although we are near day 100 that we are not near the end.  Earlier this month, we were visiting the hospital once every two weeks.  That only happened once and almost twice, and then things got a little out of whack.  I got really excited about that and even thought that going back to work was imminent!!!  But, now we've been going in at least twice a week for blood work and medication readjustments.  Bennett's cyclosporin levels (his anti-rejection drug) was going really high and then really low - like a yo-yo, I told Ben!  These levels have to be so precise and exact to ensure that his own cells do not take over his donor cells.  All of this yo-yo (ing) has now caused his potassium level to be really high - which now makes for more bloodwork.  The best part is that we have to ensure that Bennett drinks 1.5 litres a day (which is the subject of a whole other blog post - it is far from easy....).
Bennett has also developed some mouth lesions (sores).  They come and go, and are found on the inside of his lips and cheeks.  Recently, his teeth have also become very sore and he does complain of pain when swallowing.  This has really upped the ante on trying to get him to eat and drink (remember far from easy.....)  We have met with our ENT specialist and they have decided (finally!!!) to biopsy the lesions and determine what exactly they are (graft vs. host, viral infection, etc.; I try not to think about the etc. part).  A surgery really isn't ideal because Bennett is so immuno-compromised but the doctors really don't have any idea what is causing these painful sores and the painful swallowing.  The biopsy will mean that we are admitted for hopefully just one night....at this point, it is looking like later this week or early next week - depending on when they can schedule all the right people.
At this point, we are limiting visitors to immediate family who are healthy and who have received their flu shots.  As the day nears for outside visitors, we know that ALL visitors will need to have their flu shots prior to coming inside our house.  Because Bennett has a brand new immune system that has never been exposed to the flu, this really isn't the time to expose him to the nasty virus that is going around Alberta!
And of course, an extra special thank you to the people who continue to spoil our family!  We really appreciate the many letters, postcards, emails (keep them coming!) and Christmas cards.  Thank you to J. from England for sending a boxful of amazing mementos and handwritten notes.  Thanks to Rebecca and her family for the amazing box of games and surprises!  We are overwhelmed with everyone's generosity and continue to encourage lots of emails/notes/photos for Bennett.  We are unsure at the moment of the exact return date to school and suspect that it could still be months away.......

Categories: Blogs

Flowers From Al 01

Cory Doctorow - 2014, January 6 - 13:21


Here's part one of my 2003 short story "Flowers From Al," written with Charlie Stross for New Voices in Science Fiction, a Mike Resnick anthology. It's a pervy, weird story of transhuman romance.

Mastering by John Taylor Williams: [email protected]

John Taylor Williams is an audiovisual and multimedia producer based in Washington, DC and the co-host of the Living Proof Brew Cast. Hear him wax poetic over a pint or two of beer by visiting livingproofbrewcast.com. In his free time he makes "Beer Jewelry" and "Odd Musical Furniture." He often "meditates while reading cookbooks."

MP3 link

Categories: Blogs

Vaginal Fantasy #24: Guilty Pleasures

Belmont - 2014, January 5 - 14:55

Hosts Felicia Day, Bonnie Burton, Veronica Belmont and Kiala Kazebee talk about “Guilty Pleasures” by Laurell K. Hamilton on the December edition of Vaginal Fantasy Romance Book Club.

Join the forums here!

THIS MONTH’S BOOK: Guilty Pleasures (Anita Blake, Vampire Hunter #1) by Laurell K. Hamilton

NEXT MONTH’S BOOK: The Hundred Thousand Kingdoms (The Inheritance Trilogy, #1) by NK Jemisin

ALT: Archangel by Sharon Shinn

Subscribe to Geek and Sundry: http://goo.gl/B62jl
Join our community at: http://geekandsundry.com/community

Categories: Blogs

Automating the extraction of real data from an image of the data – part 1

Matlab Image processing blog - 2013, December 31 - 05:00

I'd like to welcome back my fellow MATLAB Central blogger Brett Shoelson for the first in a three-part series on extracting curve values from a plot. You can find Brett over at the File Exchange Pick of the Week blog, or you can check out his many File Exchange contributions. -Steve

In my role as an application engineer, I frequently have the opportunity to help customers with their image processing-related problems. In fact, this is one of the aspects of my job that I enjoy the most; each customer's image poses unique--and sometimes difficult--challenges. I get to help solve them, and move on to other things.

Recently, a customer shared with me an image of a chart showing, among other things, the efficiency of a pump as a function of flow rate. He asked if I could help him automate the extraction of real data from this chart; he has to do this over and over, and is currently following a very manual workflow--clicking along the desired curve and recording the positions of his mouseclicks. If you're ever faced with a similar challenge and you can find a way to get the original data from which the chart was created, that's not only significantly easier, but less noisy than extracting the data from an image. But in this case, the original efficiency-versus-flow data were not available, and the customer had no alternative.

Here's the original image as the customer shared it

url = 'http://blogs.mathworks.com/images/steve/2013/example-pump-perf-curve.jpg';
chart = imread(url);
imshow(chart)

The consistent features

There are three axes in this image. In the top-most of the axes (which is also the largest), there are curves representing both "Head" (in m) and "Efficiency" (in percent) versus flow rate (m^3/h). (The efficiency is labeled on the right side of the top axis.) The customer was interested only in the data reflected by the "efficiency" curve, as labeled in the image.

What information can we rely on from chart to chart? After asking a few questions, I learned that the efficiency curve always starts in the lower left corner of the top axis. Moreover, the axes and curves in this graph are similar--in form, shape, and color--to the corresponding curves in the many other graphs from which he needs to extract these data. (Otherwise, automating this data extraction would be impossible, and we'd have to try for a more "semi-automated" approach to facilitate manual analyses.)

Take a moment to think about the problem

I like to make the point that solutions to image processing problems are typically non-unique. Your own approach may be different--perhaps radically so--from my own. In fact, I often reflect on problems that I've already solved, and realize that I might solve them differently if I were to face them again.

Before you read my solution, take a moment to look at the customer's image and think about how you would approach the problem. When I'm done sharing my workflow, I will encourage you to comment on it, or even to improve upon it. That way, we can learn from each other!

My solution, in three parts

I'm going to step through the steps I took to automate this data-extraction process. This was somewhat tricky, so to keep this blog post from being too long, I'm going to break it up into three parts:

  • In the first part, I'll isolate the axes that contains the efficiency curve, and then begin to isolate the two blue curves within that axes.
  • In the second post, I'll segment the remaining curves, and process them to isolate only the curve of interest.
  • Finally, in post three, I'll extract the x-y coordinates of the efficiency curve, and create a custom "fit object" that will allow me to automate the determination of efficiency versus flow rate. Along the way, I'll share my thoughts about this process.
Boosting the "SNR"

The information we're after (the "signal") is represented graphically in the "Efficiency" curve in the top axes. The rest of the curves are just "noise" here, and our initial task is to ascertain how to discard all of those noise sources, while keeping the signal intact.

Recognize that each axis in the original image is delineated by a black border (though that's difficult to see here, with the screen capture). With little effort, we can convert those black borders to white; that will allow us to create masks of each axis, and to select (or discard) specific ones.

imshow(chart)
title('Original RGB')

Let's create a reversed, binary version of the original; this will cast the black borders to white:

bw = im2bw(imcomplement(rgb2gray(chart))); % I combined some steps here
imshow(bw)
title('Grayscaled/Reversed/Binarized')

Now that the axes are bordered in white, we can fill the image regions to create solid regions of interest:

filled = imfill(bw,'holes');
imshow(filled)
title('All Regions Filled')

Since I know that the axis I want to keep is the largest one, I can use that information to create a mask that will be useful for discarding a lot of the noise in the original image:

cc = bwconncomp(filled);
stats = regionprops(cc,'Area');
A = [stats.Area];
[~,biggest] = max(A);
filled(labelmatrix(cc)~=biggest) = 0;
imshow(filled);
title('Axis mask')
bb = regionprops(filled,'BoundingBox'); %We'll use this later!
bb = bb.BoundingBox;

(Note that the selection of the region [axes] of interest [axes 4, bottom right] was achieved simply by applying a logical constraint, and that I could easily have elected to use any combination of logical conditions to select the desired axes. For instance, I could have calculated both areas and centroids of the "blobs" in the "All-Regions-Filled" black and white image. Then I could have picked the region that is larger than a threshold area value, and that has a y-centroid above those of the other axes.)
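As a rough sketch of that alternative (the 10,000-pixel area threshold and the variable names below are illustrative assumptions, not values from the post), the selection might look like this:

% Alternative selection sketch: keep the large region whose centroid sits
% highest in the image (smallest y, since rows increase downward).
ccAll  = bwconncomp(imfill(bw,'holes'));
stats2 = regionprops(ccAll,'Area','Centroid');
cents  = vertcat(stats2.Centroid);            % [x y] per region
candidates = find([stats2.Area] > 10000);     % assumed area threshold
[~,k]  = min(cents(candidates,2));            % topmost centroid among them
pick   = candidates(k);
maskAlt = (labelmatrix(ccAll) == pick);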

Masking the original color image

Now we can use this mask to pare the original color image, plane-by-plane:

chart(~repmat(filled,[1 1 3])) = 0;
imshow(chart)
title('Masked RGB Original')

We've now discarded a lot of the noise, but the difficult part remains. Unfortunately, several non-interest curves intersect the curve we need. Even more unfortunately, one of those curves is the same color as the curve of interest! But perhaps we can continue along the path of noise reduction by exploring and using the color information in the image; it's possible that something in a grayscale representation of the image (rgb2gray, or individual red, green, or blue colorplanes) could be useful here. To explore that possibility, I like to use ExploreRGB.

ExploreRGB(chart)

Nothing that's immediately obvious jumps out at me that will make the selection of the efficiency curve any easier. But let's click on the "Advanced-Mode" toolbar button (upper left) to explore different colorspace representations:

Now, by studying the "blue chrominance image" (in the 4th row, 2nd column), we can see that the blue curves can be readily differentiated from non-blue curves simply by changing colorspaces.

(Recognize that I could simply click on the blue-chrominance image, and then right-click to export that image to the base workspace, but I'll show the equivalent code here.) Convert to YCbCr and extract the blue-chrominance image:

curveOfInterest = rgb2ycbcr(chart);
curveOfInterest = curveOfInterest(:,:,2);
imshow(curveOfInterest);
title('Cb/masked');

By the way, I didn't have to create handles to the subplot axes (ax(1)--ax(4), above). But doing so allows me to use expandAxes, which conveniently lets me click on any of those axes to expand it to full screen, and then to right-click to export the image it contains to my MATLAB workspace.

Done...for now.

Let's stop there. Next week, we'll resume the problem and isolate only the efficiency curve. (That, it turns out, is the most difficult part of this problem!)

The complete series



Published with MATLAB® R2013b


Categories: Blogs

Favorite Books of 2013

Flog - 2013, December 27 - 14:21

I loathe top 10 lists, and at the same time, always read them and get great tips for stuff I enjoy later.  So…yeah.  Hipster hypocrite, thy name is Felicia.

Anyway, I have read some great books this year (branched out a bit from traditional romances) and wanted to share my faves with you! ONLY FIVE OF THEM BECAUSE HIPSTER ME DOESNT WANNA DO TEN!


Golem and Jinni by Helene Wecker

This is one of my favorite books of the year. I didn’t know a ton about it going in, other than loving the gorgeous cover, but I’m very glad I didn’t. It is a historical urban fantasy of sorts, about a Golem and a Djinn separately stranded in turn-of-the-century New York City. The two characters’ storylines intertwine beautifully, with themes of identity, religion and friendship weaving in and out of a wonderfully detailed world. If you liked The Night Circus, or Jonathan Strange and Mr Norrell, you’ll really love this book.


How to be a Woman by Caitlin Moran

I don’t read a ton of biographies but this one was so outrageous and smart and funny that it made me a fan of the genre.  Caitlin Moran is unabashedly a feminist, but she fills out the word in a way you actually WANT to identify with the phrase. Weaving her opinions in and out of her crazy family stories, she spurs the reader to re-think concepts of woman’s rights and feminism, which for some weird reason have accumulated a negative stigma over the last few decades. In a day and age where more and more legislation is being passed against BIRTH CONTROL (I mean, what the hell people?!) we need more writers, men OR women, to put sexual issues in plain, and hilariously dirty language like this author.  Recommended!


Alex Verus Novels by Benedict Jacka

Technically this is not a book, it’s a series, but I gobbled all 5 (6?) of them up in a few weeks. It’s an urban fantasy series set in modern day London about mages and secret light and dark magic and YUM is it fun!  I love the main character, his magic is divination, so he can see all possible outcomes, which is dealt with in a really fun and believable way.  His assistant is a girl who has a curse, and the characters are really fun and multi-dimensional.  Very evocative of the Harry Dresden books (not a bunch of romance, for those Vaginal Fantasy ladies out there, sorry!)


Quiet by Susan Cain

I have had a lot of issues I’ve been dealing with this year, mental and physical, and it was wonderful to read this book about where some of the possible roots lay, and how to cope with them.  This book was so helpful and methodical in highlighting the differences between extroverts and introverts, and how society has an (unjustified) bias towards the more outgoing people.  It had great suggestions in how to cope and assert yourself in an organic way, how there is a lot of strength in being the type of person who can be quiet and thoughtful, and that good leadership isn’t always about the brashest and most dynamic.  Highly recommend for other shy people out there!


Alif the Unseen by G Willow Wilson

Another Genie book!  This one is set in the modern-day Middle East, and it mixes mythology and technology in a really fun, engrossing story.  Alif is a hacker in love with a woman above his station, who gets dragged into the hidden world of genies while trying to avoid frightening prosecution by the state, who has eyes everywhere, especially the internet.  It’s a romance, a thriller, a mythic epic and a religious contemplation all in one.  Wonderful insight into Islamic culture as well as the magic of Djinn.  So inventive and original, even if it sounds like a strange mash-up, try it!  You’ll enjoy it!

Let me know what really struck you this year!  It will allow me to avoid OTHER people’s top 10 lists
HAPPY NEW YEAR!

 

Categories: Blogs

Rowdy Rowdy Eleanor Goudie

Fighting the Dragons - 2013, December 26 - 22:59


I started this post this morning and it was titled "Simply Having a Wonderful Christmastime" and was basically just photos from the past month. I didn't get a chance to finish it because Eleanor woke from her nap a little off and we have spent the last 5 hours watching her like hawks. The last four days have been very busy. We stretched our girls pretty far and had them out late and going all day. Penelope was pretty obvious when she got fed up. She would cry and cling tightly to me or Kris and not make eye contact with any admirers. Eleanor, however, was the centre of attention. She was bossing everyone around ("more Jingle Bell Rock?") and was the life of every party. She loves the spotlight and turns into a little Energizer bunny. She just keeps going and going and going and going...
Bossing her beloved Jack

But it seems that today she really hit the wall. She woke up groggy and played a little bit, but then was content snuggling and watching Winnie the Pooh. She nearly sat through the entire thing, which is unheard of for her. Kris made her favourite yummy dinner (yummy chicken - everything is yummy at the moment) and she wouldn't touch it. Her temperature was high, but not quite a fever. We gave her some Tylenol and another dose of cortef and waited an hour. Nothing changed, so we gave her another dose of cortef. That seemed to do the trick and about an hour later she started eating and drinking again. Her temperature came back down to normal and she perked back up. Kris and I had a moment of "what the fuck?" because seriously? Too much Christmas sends this kid into adrenal crisis? She has had an incredible year and has been in such good health, I feel totally overwhelmed when something like this happens. I hope when she wakes up tomorrow she is back to her regular self. We are cancelling the rest of our holiday parties, though! Girlfriend likes to pretend she can keep up with the big boys, and she is such a good little actress I thought she could.

***
I've spent a good deal of time over the holidays thinking about the ghosts of Christmases past. We have had four with Eleanor, FOUR. That feels like so many for such a tiny girl. The first she was only two weeks old, and I spent most of it lying on the hide-a-bed in the tv/playroom at my parents house, watching episode after episode of Mad Men and nursing Eleanor. It was pretty awesome. The second was awful. We were discharged from Children's with the intent of having a quiet Christmas and instead ended up in the PICU at Vic General and watched a young man die. We got out just in time to celebrate, but ended up back in the hospital on Christmas Eve when she pulled her NG tube out. Last year everything was good, but we had no family around, so it felt a little lonely. This is the first Christmas she has been healthy and I guess I got a little carried away with how much she could handle. It doesn't help that this holiday season came right on the tail end of a very busy month of hospital visits.
Anyway, here are some of the lovely photos of our very special Christmas - our first with Penelope!
Who eats, and eats, and eats...


The carousel at Butchart Gardens



 The Festival of Trees at the Empress


Gingerbread!

Playing til the breaka breaka dawn at our old friend's Christmas Eve party (well, til 9:30 but it might as well have been the break of dawn)

 Oh, speaking of... this was the sunrise on Christmas morning. You can almost hear the angels singing :-)


These dolls will be the death of me...



Happy Holidays!

Categories: Blogs

Christmastime daddy-daughter podcast with Poesy

Cory Doctorow - 2013, December 23 - 10:42

Every year, there's a day or two between the date that my daughter's school shuts and the day that my wife's office shuts for Christmas holidays. Those are the official seasonal mid-week daddy-daughter days, and for the past two years, my daughter and I have gone to my office to record a podcast. Last year's was great, but I think we hit a new high this year (MP3).

Categories: Blogs

Lawful Interception 04

Cory Doctorow - 2013, December 23 - 10:39

Here's part four of a reading of my novella Lawful Interception, a sequel, of sorts, to Little Brother and Homeland. In addition to the free online read, you can buy this as an ebook single (DRM-free, of course!)

(Image: Yuko Shimizu)

Mastering by John Taylor Williams: [email protected]

John Taylor Williams is an audiovisual and multimedia producer based in Washington, DC and the co-host of the Living Proof Brew Cast. Hear him wax poetic over a pint or two of beer by visiting livingproofbrewcast.com. In his free time he makes "Beer Jewelry" and "Odd Musical Furniture." He often "meditates while reading cookbooks."

MP3 link

Categories: Blogs

Play Little Brother Jeopardy! online

Cory Doctorow - 2013, December 17 - 05:07


Don Liebold teaches High School English in Milwaukee, where he and his class read my novel Little Brother. He writes: "To celebrate finishing the book, we are playing Jeopardy tomorrow in class. Here is round 1, and here is round 2."

Those are tough questions! I missed a couple!

Categories: Blogs

Lawful Interception 03

Cory Doctorow - 2013, December 17 - 00:35

Here's part three of a reading of my novella Lawful Interception, a sequel, of sorts, to Little Brother and Homeland. In addition to the free online read, you can buy this as an ebook single (DRM-free, of course!)

(Image: Yuko Shimizu)

Mastering by John Taylor Williams: [email protected]

John Taylor Williams is an audiovisual and multimedia producer based in Washington, DC and the co-host of the Living Proof Brew Cast. Hear him wax poetic over a pint or two of beer by visiting livingproofbrewcast.com. In his free time he makes "Beer Jewelry" and "Odd Musical Furniture." He often "meditates while reading cookbooks."

MP3 link

Categories: Blogs

The 8 Must-Have Gadgets for the Holidays

Belmont - 2013, December 11 - 16:43

With so many new technology releases around the holidays, it can be daunting to determine what’s awesome and what’s just a lot of hype. Luckily, we’ve done the legwork for you and found all the very best geek-chic gadgets out there this holiday season – find out which items made our list right here on The Sync Up!

Subscribe to POPSUGAR Girls’ Guide
Are we friends yet? Join us on Facebook!
Get the latest updates via Twitter!
Find us on Instagram!
Follow us on Tumblr!
Add us on Google+!

Categories: Blogs

Image processing with a GPU

Matlab Image processing blog - 2013, December 10 - 11:19

I'd like to welcome guest blogger Anand Raja for today's post. Anand is a developer on the Image Processing Toolbox team. -Steve

Many desktop computers and laptops now come with fairly powerful Graphics Processing Units (GPUs). Initially, GPUs were mostly used to power computations for graphics applications, but soon people realized that they are just as useful for any kind of numerical computing.

GPUs are made of a large number of processing units which by themselves aren't very powerful, but become formidable when used in tandem. So, if you have processing to be done that is parallelizable, the GPU will be a great fit.

With that in mind, isn't it almost obvious that image processing is a great fit for GPUs! A lot of image processing algorithms are data-parallel, meaning the same task/computation needs to be performed on many elements of the data. Lots of image processing algorithms either operate on pixels independently or rely only on a neighborhood around pixels (like image filtering).
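To make "data-parallel" concrete, here's a tiny sketch. (It assumes a supported GPU and jumps ahead to the gpuArray transfer that I'll introduce properly in a moment; the brightness offset is arbitrary.)

% Element-wise arithmetic on a gpuArray applies the same operation to
% every pixel in parallel on the device.
imGPU = gpuArray(imread('peppers.png'));   % a demo image that ships with MATLAB
brightened = imGPU + 40;                   % per-pixel, data-parallel
imshow(gather(brightened))                 % bring the result back to the CPU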

So, let's get down to it. My desktop computer has a GPU, and I want to do some image processing using my favorite software (no prizes for guessing), MATLAB. Note that in order to interact with the GPU from MATLAB, you require the Parallel Computing Toolbox.

I can use the gpuDevice function to get information about my GPU.

gpuDevice

ans =

  CUDADevice with properties:

                      Name: 'Tesla C2075'
                     Index: 1
         ComputeCapability: '2.0'
            SupportsDouble: 1
             DriverVersion: 5.5000
            ToolkitVersion: 5
        MaxThreadsPerBlock: 1024
          MaxShmemPerBlock: 49152
        MaxThreadBlockSize: [1024 1024 64]
               MaxGridSize: [65535 65535 65535]
                 SIMDWidth: 32
               TotalMemory: 5.6368e+09
                FreeMemory: 5.5362e+09
       MultiprocessorCount: 14
              ClockRateKHz: 1147000
               ComputeMode: 'Default'
      GPUOverlapsTransfers: 1
    KernelExecutionTimeout: 0
          CanMapHostMemory: 1
           DeviceSupported: 1
            DeviceSelected: 1

Seeing that I have a supported GPU, I can read an image and transfer the image data to my GPU using the constructor for the gpuArray class. The gpuArray object is used to access and work with data on the GPU.

im = imread('concordaerial.png');
imGPU = gpuArray(im);
imshow(imGPU);

So imGPU is a gpuArray object containing data of type uint8.

class(imGPU)
classUnderlying(imGPU)

ans =

gpuArray


ans =

uint8

A number of the functions in the Image Processing Toolbox have support for GPU processing in R2013b. This means you can accelerate existing MATLAB scripts and functions with minimal changes. To find the list of functions that are supported for GPU processing in the Image Processing Toolbox, you can visit this page. Some of the basic image processing algorithms like image filtering, morphology and edge detection have GPU support and this list is going to grow in the coming releases.

Let's look at a small example to set the ball rolling. Inspired by Brett Shoelson's guest post a few months back about Photoshop-like effects in MATLAB, I thought I might do one of my own. I call it the canvas effect. The canvas effect gives an image the feel of a canvas painting. I created this little function that does it.

type canvasEffect

function out = canvasEffect(im)
% Filter the image with a Gaussian kernel.
h = fspecial('gaussian');
imf = imfilter(im,h);
% Increase image contrast for each color channel.
ima = cat( 3, imadjust(imf(:,:,1)), imadjust(imf(:,:,2)), imadjust(im(:,:,3)) );
% Perform a morphological opening on the image with a disk-shaped
% structuring element.
se = strel('disk',9);
out = imopen(ima,se);

It's fairly straightforward. I first smooth the image with a Gaussian kernel to round off some edges. Then, to give the effect of more vivid colors, I increase the contrast for each color channel, and finally a morphological opening gives it the canvas-painting look. Of course, you could add more bells and whistles by providing additional inputs for the filter kernel size and structuring element, but I wanted to keep it simple.

The script below reads an aerial image and gives it that canvas painting effect.

type canvasAerialCPU

% Read the image.
im = imread('concordaerial.png');
% Produce canvas effect.
canvas = canvasEffect(im);
% Display the canvas-ed image.
figure;
imshow(canvas);

All the processing in the script above was done on the CPU. To move the computation to the GPU, I need to transfer the image from the CPU to the GPU using the gpuArray constructor. So the new script would look like this:

type canvasAerialGPU
run canvasAerialGPU

% Read the image.
im = imread('concordaerial.png');
% Transfer data to the GPU.
imGPU = gpuArray(im);
% Produce canvas effect.
canvasGPU = canvasEffect(imGPU);
% Gather data back from the GPU.
canvas = gather(canvasGPU);
% Display the canvas-ed image.
figure;
imshow(canvas);

Wasn't that easy! All I had to do was convert the image to a gpuArray and gather data back after all the computation was done. The function canvasEffect did not have to change at all. This was because all functions used in canvasEffect were supported for GPU computing.

Let's see how much of a win this is in terms of performance. For a few years now I've been using the timeit function that Steve put on the File Exchange. From R2013b, the timeit function is part of MATLAB.

cpuTime = timeit(@()canvasEffect(im), 1) cpuTime = 3.1311

This function, however, can only be used to benchmark computations run on the CPU. For the GPU, a dedicated benchmarking function, gputimeit, is provided. It ensures that all computations have completed on the GPU before recording the finish time.
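To see why a dedicated timer matters, here's a small illustration of my own: GPU calls are asynchronous, so a naive tic/toc can stop the clock before the kernels have actually finished, unless you explicitly wait for the device.

% Naive timing: may under-report, because control returns to MATLAB
% before the GPU work is complete.
tic
canvasGPU = canvasEffect(imGPU);
naiveTime = toc

% Manual fix: block until the device is idle before stopping the clock.
tic
canvasGPU = canvasEffect(imGPU);
wait(gpuDevice);
syncedTime = toc

% gputimeit, used next, handles this synchronization (along with
% warm-up and multiple runs) automatically.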

gpuTime = gputimeit(@()canvasEffect(imGPU), 1) gpuTime = 0.2130

So with these small changes, I was able to get a considerable speed-up. Imagine having to do this on an entire data set of images. Working with the GPU would save a lot of processing time.

speedup = cpuTime/gpuTime speedup = 14.6990

This is not the complete picture though. I have not accounted for the time it takes to transfer data from the CPU to the GPU and back. This may or may not be significant, depending on how long the computations themselves take. As a rule of thumb, minimize data transfers to and from the device.

transferTimeToGPU = gputimeit(@()gpuArray(im), 1)
transferTimeToCPU = gputimeit(@()gather(canvasGPU), 1)
gpuTime = transferTimeToGPU + gpuTime + transferTimeToCPU;
speedup = cpuTime/gpuTime

transferTimeToGPU = 0.0037
transferTimeToCPU = 0.0074
speedup = 13.9753
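To illustrate that rule of thumb, here's a small sketch of my own that contrasts a pipeline which bounces data between the CPU and GPU at every step with one that keeps intermediate results on the GPU and gathers only once at the end:

% Wasteful pattern: gather after every step, re-upload for the next one.
blurred = gather(imfilter(gpuArray(im), fspecial('gaussian')));
opened  = gather(imopen(gpuArray(blurred), strel('disk',9)));

% Better pattern: stay on the GPU for the whole pipeline, transfer once.
imGPU      = gpuArray(im);
blurredGPU = imfilter(imGPU, fspecial('gaussian'));
openedGPU  = imopen(blurredGPU, strel('disk',9));
result     = gather(openedGPU);   % single transfer back to the CPU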

I'm going to end with some pointers about the performance of GPU processing.

  1. We've seen in the simple example above that you can get a significant speed-up using the supported functions. However, the speed-up is highly dependent on your hardware. If you have a very capable CPU with multiple cores and a not-so-good GPU, the speed-up can appear poor, because functions like imfilter and imopen are multithreaded on the CPU. Similarly, if you have a good GPU paired with a not-so-capable CPU, the speed-up can make GPU execution look better than it really is.
  2. The speed-up achieved also depends on image size. At smaller image sizes, the overhead of parsing input arguments and moving data to and from the GPU contributes to lower speed-ups. Here's an example that demonstrates this.
% Define image sizes over which to measure performance.
sizes = [100 500 2000 4000];
% Preallocate timing arrays.
[cpuTime,gpuTime,transferTimeToGPU,transferTimeToCPU] = deal(zeros(size(sizes)));
for n = 1:numel(sizes)
    sz = sizes(n);
    % Resize image to sz-by-sz.
    im_scaled = imresize(im,[sz sz]);
    % Transfer resized image to GPU.
    imGPU_scaled = gpuArray(im_scaled);
    % Process image on GPU.
    canvasGPU_scaled = canvasEffect(imGPU_scaled);
    % Time CPU execution.
    cpuTime(n) = timeit(@()canvasEffect(im_scaled), 1);
    % Time GPU execution, including transfers in each direction.
    transferTimeToGPU(n) = gputimeit(@()gpuArray(im_scaled), 1);
    gpuTime(n)           = gputimeit(@()canvasEffect(imGPU_scaled), 1);
    transferTimeToCPU(n) = gputimeit(@()gather(canvasGPU_scaled), 1);
end
gpuTotalTime = transferTimeToGPU + gpuTime + transferTimeToCPU;

% Plot CPU vs GPU execution time.
figure;
plot(sizes, cpuTime,      'rx--', ...
     sizes, gpuTotalTime, 'bx--', 'LineWidth',2);
legend('cpu time','gpu time');
xlabel('image size [n x n]');
ylabel('execution time');
title('cpu time vs gpu time');

% Plot the speed-up as a function of image size.
figure;
plot(sizes, cpuTime./gpuTotalTime, 'LineWidth',2);
xlabel('image size [n x n]');
ylabel('speed up');
title('Speed up');

I hope this got you as excited about image processing with GPUs as it did me!


Published with MATLAB® R2013b


Categories: Blogs

Little Brother stageplay now available for local performances

Cory Doctorow - 2013, December 10 - 10:03

Josh Costello is the playwright who created the award-winning, sold-out stage adaptation of my novel Little Brother. Now, he writes, "The stage adaptation of Cory's novel Little Brother was a big hit in San Francisco in 2012, and the script is now available for licensing. Want to see Little Brother on stage in your city? Playwright Josh Costello has posted information about licensing the script, along with video clips, photos, and reviews from the premiere. If you know folks in your local theatre community that might be interested in producing the play, this is where to send them.

"Theaters will want to know that the adaptation is a full-length play for three actors (one woman and two men). There is also a large-cast version available (which might especially appeal to schools). A script sample (as published in Theatre Bay Area Magazine) is available for download.

If you’re an Artistic Director or Literary Manager looking for a small-cast play that speaks to the present moment, that appeals to a younger audience, and that comes with a sizable fan base eager to buy tickets, please take a look and let me know you’d like to read the script.

If you’re looking for a large-cast play with great parts for young people — perhaps for production at a high school or college — there is a large-cast version of the adaptation available as well.

If you’re a fan of the book and you want to see the play performed in your city, let’s make it happen. If you know folks in your local theatre community, tell them about the play and send them here (and let me know as well so I can follow up). If you don’t know anyone, let me know and I’ll see if I can make a connection.

The premiere production in San Francisco was immensely exciting, artistically fulfilling for me personally, and tremendously successful. I know it would be a hit in other cities as well — Cory Doctorow’s novel is so good, and the story is as relevant as ever. Let’s do it again.

Let’s get Little Brother back on stage

Categories: Blogs

Coming to Edinburgh tomorrow night

Cory Doctorow - 2013, December 10 - 05:34

Tomorrow night, I'll be at Edinburgh's Pulp Fiction Books for a talk and signing! It's free to attend (but ticketed, due to limited space), and runs from 7PM to 8:30. Hope to see you!

Categories: Blogs

Peak indifference to surveillance

Cory Doctorow - 2013, December 10 - 00:59


In my latest Guardian column, I suggest that we have reached "peak indifference to spying," the turning point at which the number of people alarmed by surveillance will only grow. It's not the end of surveillance, it's not even the beginning of the end of surveillance, but it's the beginning of the beginning of the end of surveillance.

We have reached the moment after which the number of people who give a damn about their privacy will only increase. The number of people who are so unaware of their privilege or blind to their risk that they think "nothing to hide/nothing to fear" is a viable way to run a civilisation will only decline from here on in.

And that is the beginning of a significant change.

Like all security, privacy is hard. It requires subtle thinking, and the conjunction of law, markets, technology and norms to get right. All four of those factors have been sorely lacking.

The default posture of our devices and software has been to haemorrhage our most sensitive data for anyone who cared to eavesdrop upon them. The default posture of law – fuelled by an unholy confluence of Big Data business models and Greater Manure Pile surveillance – has been to allow for nearly unfettered collection by spies, companies, and companies that provide data to spies. The privacy norm has been all over the place, but mostly dominated by nothing-to-hide. And thanks to the norm, the market for privacy technology has been nearly nonexistent – people with "nothing to fear" won't pay a penny extra for privacy technology.

We cannot afford to be indifferent to internet spying

(Image: Anonymity, Privacy, and Security Online/Pew Center)

Categories: Blogs

Lawful Interception 02

Cory Doctorow - 2013, December 9 - 12:37

Here's part two of a reading of my novella Lawful Interception, a sequel, of sorts, to Little Brother and Homeland. In addition to the free online read, you can buy this as an ebook single (DRM-free, of course!)

(Image: Yuko Shimizu)

Mastering by John Taylor Williams: [email protected]

John Taylor Williams is an audiovisual and multimedia producer based in Washington, DC and the co-host of the Living Proof Brew Cast. Hear him wax poetic over a pint or two of beer by visiting livingproofbrewcast.com. In his free time he makes "Beer Jewelry" and "Odd Musical Furniture." He often "meditates while reading cookbooks."

MP3 link

Categories: Blogs

All snow and all play make Eleanor something something

Fighting the Dragons - 2013, December 5 - 15:02

Go crazy?
Don't mind if I do!!!!
Like with hippies, you should never go with a toddler to a second location. Stuff gets weird. Running around in the snow after play group with the cute German boy? Fun! Continuing the fun at the beach with the dog? Disaster!


We go NOW!
Kris showed his Russian hairdresser some photos of the girls and she quickly quipped "That one (Eleanor) Irish. That one (Penelope) Baltic." It's true that Penny looks more like Kris and Eleanor looks more like me, but I have never seen it more clearly than when it starts to snow. Have you ever seen anyone so peaceful? Tiny girlfriend was made for snow. She was absolutely serene today. Eleanor stayed true to her mad Irish ways and only settled down from complete hysterics when we came home and had hot chocolate. 

(Emphatically signing "good")
Oh lord. Never a dull moment with that girl!

Categories: Blogs

Games I’ve Been Playing

Flog - 2013, December 4 - 18:48

I have been doing some gaming lately, because I miss it and am trying to be on brain hiatus to get some creative regeneration going on, (and I’ve been enjoying myself IMMENSELY), so I thought I’d share some of my favorites with you here!

Dead Rising 3

Got an Xbox One (please don’t start with the rivalry, it makes me soooo tired, lol) and am enjoying it, except for when the ever-listening Kinect starts to do a task without me meaning it to, chatting over headsets can be a bit of a landmine, lol.  Anyway, this was a launch title for the console and it is FUUUUN, especially playing with friends online.  I have not played any of these Dead Rising games before, and I feel sad because of it. This thing is like a Tarantino movie in gaming form.  You’re fighting zombies in an open-world environment, but it’s so over the top bloody and ridiculous, you kill SO MANY ZOMBIES in SO MANY DISGUSTING WAYS, that you can’t help but giggle at the ridiculousness. I can see why this game got banned in Germany, and there’s a part of me that REALLY needs to be offended by the sexism and the weird offensive bosses, but there’s a visceral satisfaction to getting in a steam roller while wearing a tuxedo and a lego head (you unlock tons of outfit options) and crushing hundreds of zombies.  The fountain of blood spurting is so ridiculous you can’t take it seriously.  Recommended for Left 4 Dead fans, and Skyrim fans.  OMG a game like this set in a RPG setting would be my DREAM GAME.  And if I could play a girl character

Oh, and here’s a nice Easter egg I found: An upscale woman’s clothing shop with my name, I got a FABULOUS women’s button up shirt there, SO CHIC!

 

Long Live the Queen

 

This is a weird Japanese game that is SUPER similar to a game I loved many years ago called “Princess Maker” except it’s less offensive since you’re playing the actual Princess, and not a nebulous “ward” raising her (and possibly marrying her in one outcome, WHAT?!)  It’s a life simulator game, where you train the princess in certain skills, and she gets bonuses on the skills depending on her mood.  As she levels, problems arise, and if you didn’t raise her right, basically she dies in a litany of gruesome outcomes.  It’s kind of amazing.  I have YET to beat the game, I’ve gotten far, but time and time again I’ve come upon an encounter that requires a skill I don’t have and BLECK I’m poisoned or executed or whatever. It’s totally excellent and odd and a style of game that is unique and definitely worth checking out.  It looks SUPER girly at first glance, but I think people will enjoy the strategy, woman or man!

 

Papers Please

Even weirder than Long Live the Queen (and that’s saying a LOT) is this indie game that has won a ton of awards. Ever dream of approving passports in a virtual communist state?  Imprisoning people in order to not have your kids starve to death?  Well, this game can make that wish come true! It has a big strategy element in that it’s almost a memory game.  You’re assigned to a border crossing, and have to let people into your country based on checking an increasing number of criteria in their paperwork, and if you mess up you make less money (and your family starves to death).  And you need volume in order to get enough money to keep the lights on, so you have to go quickly, which increases your margin of error. Within the sessions of passport approval are a few sub-games, like a conspiracy you can join, serial killers and a few other fun things.  It’s a struggle to juggle all of it, and that’s part of the fun.  And the sound effects are really weird, my dog was quite disturbed when I played it, haha.

 

Lemee know if you’re playing something you recommend, I’m counting the days until Peggle 2. MONDAY MONDAY MONDAY!

Categories: Blogs

Video of the Doctor Who Pre-Show on BBC America!

Belmont - 2013, December 4 - 11:28

Sorry for the late posting, things have been crazy with the holidays! Here’s the video of the pre-show we did for the Doctor Who 50th Anniversary Pre-Show!

Categories: Blogs