
Feed aggregator


Victor J. Bartuska, co-founder of the former Chemagnetics, died

NMR blog - 2015, April 8 - 05:00

Vanishing traces of a chapter in US NMR history.

Categories: Blogs

Displaying a color gamut surface

Matlab Image processing blog - 2015, April 3 - 13:40

Today I'll show you one way to visualize the sRGB color gamut in L*a*b* space with the help of a couple of new functions introduced last fall in the R2014b release. (I originally planned to post this a few months ago, but I got sidetracked writing about colormaps.)

The first new function is called boundary, and it is in MATLAB. Given a set of 2-D or 3-D points, boundary computes, well, the boundary.

Here's an example to illustrate.

x = gallery('uniformdata',30,1,1);
y = gallery('uniformdata',30,1,10);
plot(x,y,'.')
axis([-0.2 1.2 -0.2 1.2])
axis equal

Now compute and plot the boundary around the points.

k = boundary(x,y);
hold on
plot(x(k),y(k))
hold off

"But, Steve," some of you are saying, "that's not the only possible boundary around these points, right?"

Right. The function boundary has an optional shrink factor that you can specify. A shrink factor of 0 corresponds to the convex hull. A shrink factor of 1 gives a compact boundary that envelops all the points.

k0 = boundary(x,y,0);
k1 = boundary(x,y,1);
hold on
plot(x(k0),y(k0))
plot(x(k1),y(k1))
hold off
legend('Original points','Shrink factor: 0.5 (default)',...
    'Shrink factor: 0','Shrink factor: 1')
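The shrink-factor-0 case is just the convex hull, which you can compute without MATLAB at all. Here is a rough Python sketch using the standard monotone-chain algorithm; the function name and the sample point set are illustrative, not part of the original post:

```python
def convex_hull(points):
    """Return the convex hull of 2-D points (Andrew's monotone chain),
    in counterclockwise order -- the shrink-factor-0 boundary."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Each chain ends where the other begins, so drop the duplicated endpoints.
    return lower[:-1] + upper[:-1]

# A unit square with one interior point: the hull keeps only the corners.
hull = convex_hull([(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)])
```

A shrink factor greater than 0 shrink-wraps the hull inward toward the points, which requires more machinery (alpha shapes); this sketch covers only the convex case.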

Here's a 3-D example using boundary. First, the points:

P = gallery('uniformdata',30,3,5);
plot3(P(:,1),P(:,2),P(:,3),'.','MarkerSize',10)
grid on

Now the boundary, plotted using trisurf:

k = boundary(P);
hold on
trisurf(k,P(:,1),P(:,2),P(:,3),'FaceColor','red','FaceAlpha',0.1)
hold off

The second new function I wanted to mention is rgb2lab. This function is in the Image Processing Toolbox. The toolbox could convert from sRGB to L*a*b* before, but this function makes it a bit easier. (And, if you're interested, it supports not only sRGB but also Adobe RGB 1998.)
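If you are curious what a conversion like rgb2lab computes, here is a pure-Python sketch of the textbook sRGB → XYZ → L*a*b* pipeline with a D65 white point. This is the standard published conversion, not MathWorks' actual implementation, and the function name is my own:

```python
def srgb_to_lab(r, g, b):
    """Convert one sRGB color (components in [0, 1]) to CIE L*a*b* (D65)."""
    # Undo the sRGB gamma encoding (IEC 61966-2-1).
    def lin(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    rl, gl, bl = lin(r), lin(g), lin(b)

    # Linear sRGB -> CIE XYZ (D65 primaries).
    x = 0.4124564 * rl + 0.3575761 * gl + 0.1804375 * bl
    y = 0.2126729 * rl + 0.7151522 * gl + 0.0721750 * bl
    z = 0.0193339 * rl + 0.1191920 * gl + 0.9503041 * bl

    # Normalize by the D65 reference white, then apply the CIE cube-root curve.
    xn, yn, zn = 0.95047, 1.0, 1.08883

    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29

    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    L = 116 * fy - 16
    a = 500 * (fx - fy)
    b_star = 200 * (fy - fz)
    return L, a, b_star
```

Sanity checks: pure white (1, 1, 1) should land at L* = 100 with a* and b* near zero, and black at L* = 0.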

Just for grins, let's reverse the a* and b* color coordinates for an image.

rgb = imread('peppers.png');
imshow(rgb)
lab = rgb2lab(rgb);
labp = lab(:,:,[1 3 2]);
rgbp = lab2rgb(labp);
imshow(rgbp)

Now let's get to work on visualizing the sRGB gamut surface. The basic strategy is to make a grid of points in RGB space, transform them to L*a*b* space, and find the boundary. (We'll use the default shrink factor.)

[r,g,b] = meshgrid(linspace(0,1,50));
rgb = [r(:), g(:), b(:)];
lab = rgb2lab(rgb);
a = lab(:,2);
b = lab(:,3);
L = lab(:,1);
k = boundary(a,b,L);
trisurf(k,a,b,L,'FaceColor','interp',...
    'FaceVertexCData',rgb,'EdgeColor','none')
xlabel('a*')
ylabel('b*')
zlabel('L*')
axis([-110 110 -110 110 0 100])
view(-10,35)
axis equal
title('sRGB gamut surface in L*a*b* space')

Here are another couple of view angles.

view(75,20)
view(185,15)

That's it for this week. Have fun with color-space surfaces!


Published with MATLAB® R2015a


Categories: Blogs

Two new books: Prepolarized 129Xe-NMR, and a Fetal Imaging tome

NMR blog - 2015, April 3 - 05:00

Put side-by-side, these books illustrate the width of MR applications.

Categories: Blogs

Breaking News: The End of Spin ... and of The Universe

NMR blog - 2015, April 1 - 05:00

New insights on the Big Crunch and its impact on Magnetic Resonance.

Categories: Blogs

Clean Reader is a free speech issue

Cory Doctorow - 2015, March 30 - 08:02


My latest Guardian column, Allow Clean Reader to swap ‘bad’ words in books – it’s a matter of free speech, expands on last week’s editorial about the controversial ebook reader, which lets readers mangle the books they read by programmatically swapping swear-words for milder alternatives.

I agree with the writers who say that the app is offensive, and that it makes books worse. Where I part company with Clean Reader’s detractors is where they claim that it is — or should be — illegal. If we don’t have the right to make our computers alter the things they show us, what happens to ad blockers, or apps that auto-annotate politicians’ claims, or warn you when you’re reading an article in a newspaper owned by Rupert Murdoch?

Free speech isn’t just the right to express yourself, it’s also the right not to listen. I disagree with the decision to use Clean Reader, and that’s why it’s a free speech issue. If you don’t support the legal right to utter speech you find offensive, you don’t support speech at all. That doesn’t mean we shouldn’t tell people not to use Clean Reader, or withhold our books from Clean Reader’s store. It means we can’t call for Clean Reader to be banned.

I want a future where readers get to decide how they read. I want to be able to make and share annotations to climate-denial bestsellers – even if that means deniers can mark up Naomi Klein’s This Changes Everything and share their notes. I want to be able to turn Oxford commas off and on. I want to be able to change the font, block the ads, and swap clichéd passages for humorous alternatives. I want Bechdelware that lets me choose to genderswap the characters. I want sentiment analysis that tries to sync a music playlist with the words I read.

I want people to be able to do stupid things with their computers. Because more than anything, I want computer users to have the final say about what their computers do.

That includes kids, by the way. It’s one thing for an adult to use Clean Reader to make her reading experience accord with her preferences. The same principle that says she should be allowed to dictate her computer’s behaviour means her kids should be able to decide for themselves how sweary the books they read are.

Allow Clean Reader to swap ‘bad’ words in books – it’s a matter of free speech [Cory Doctorow/The Guardian]

Categories: Blogs

Europe's NMR spectroscopy above 1 GHz

NMR blog - 2015, March 29 - 05:00

European NMR is racing for the highest possible fields ... alone!?

Categories: Blogs

Fresh comments on the current state of NMR

NMR blog - 2015, March 28 - 12:00

Résumé of Bruker's 2014 public financial statements.

Categories: Blogs

New online documentation system for R2015a

Matlab Image processing blog - 2015, March 25 - 13:45

Earlier this month we shipped R2015a, the first of our two regularly scheduled annual releases. Typically, when there's a new release, I spend some time talking about features that interest me. The feature I want to mention today, though, is a little unusual because it benefits users who haven't even upgraded to the new release yet.

With R2015a, our Documentation and Documentation Tools teams have overhauled the online documentation on mathworks.com. We've learned a lot from your feedback since the last major documentation overhaul a few years ago, and we are excited about the changes.

I'll use the repelem reference page to show you just a few pieces of it. First, notice that the left-side navigation shows you the product (MATLAB) as well as several information categories (Language Fundamentals, Matrices and Arrays, and Sorting and Reshaping Arrays) that are relevant to repelem.

Next, notice the "i" icon to the right of the product name, MATLAB. Click on this icon to get to a standard set of product information, including Getting Started, Examples, Functions and Other Reference, Release Notes, and PDF Documentation.

The left-side navigation also helps you easily see and get around to all the information that's available on the page you're looking at. Below, the left-side navigation is shown directing you to information about one of the input arguments for repelem.

Now if you haven't upgraded to R2015a yet, and if you've tried to use repelem in your MATLAB, then you've already noticed that repelem isn't there. If you scroll all the way down on the repelem reference page, it shows you why:

You can see that repelem is new in R2015a! Customers have long been asking us to provide this information on function reference pages. (Please note that the "introduced in" information is not available for all functions yet. It will take us some time to update every page.)

Finally, the new document system displays reference pages and other content very nicely on mobile devices. Here is a screenshot from my phone:

We'd like to know what you think about the new online doc. That would help our teams as they bring these updates to the documentation system that's included with the product. If you have any feedback you'd like to share, please leave a comment below.


Published with MATLAB® R2015a


Categories: Blogs

Premature departure of Alessandro Bagno

NMR blog - 2015, March 25 - 05:00

A great NMR scientist and a gentle colleague has left us ...

Categories: Blogs

MATLAB, Landsat 8, and AWS Public Data Sets

Matlab Image processing blog - 2015, March 19 - 08:00

A few weeks ago, a fellow developer (Kelly Luetkemeyer) pulled me into his office to show me something he had been working on. It was very cool! Even better, it's now available for you to try. I'd like to introduce guest blogger Bruce Tannenbaum, product marketing manager for image processing and test & measurement applications, to tell you all about it. Thanks, Bruce!

MathWorks is excited to announce a freely downloadable, MATLAB-based tool for accessing, processing, and visualizing Landsat 8 data hosted by AWS as part of its Landsat on AWS Public Data Sets. With this tool, you can create a map display of scene locations with markers that show each scene’s metadata. You can download and visualize thumbnails and any of the 11 spectral bands. The tool includes clickable links for automatically combining and processing spectral bands in a variety of ways, such as NDVI and color infrared. You can then visualize processed results in MATLAB and create map displays that help convey context about where the data is located on the Earth. This interactive tool is available on MATLAB Central. Watch the video below to see it in action.

 

How MATLAB and Landsat 8 Work Together

MATLAB provides a great environment for working with Earth observation data. It supports a wide range of file formats, including the GeoTIFF format used for Landsat 8 imagery. It also supports accessing Earth science data over the Internet, such as the HTTP requests used by AWS Public Data Sets. The Image Processing Toolbox extends MATLAB with a comprehensive set of algorithms and tools for image processing, analysis, visualization, and algorithm development. The interactive tool for Landsat 8 imagery takes advantage of processing techniques in the toolbox, such as adaptive histogram equalization.

One challenge of working with Landsat 8 imagery is that you need to know where each scene is located. A scene represents a specific time and a rectangular surface region of about 185 by 180 kilometers. To make it easier to locate scenes, our tool creates a map display with markers showing the centroid of each scene within the field of view. Additionally, the tool color-codes the markers to show cloud coverage, which is recorded in the metadata file for each scene. The most useful images have low cloud coverage; the tool makes it easy to find images with 0-10% or 10-20% cloud coverage, which give the clearest views.
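The color-coding step is essentially a bucketing of each scene's cloud-cover percentage read from its metadata. A minimal Python sketch of that idea follows; the bucket edges and the function name here are assumptions for illustration, and the actual tool's categories may differ:

```python
def cloud_cover_bucket(pct):
    """Map a scene's cloud-cover percentage to a coarse display category."""
    if not 0 <= pct <= 100:
        raise ValueError("cloud cover must be in [0, 100]")
    bounds = [0, 10, 20, 40, 60, 80, 100]
    # Return the first bucket whose upper edge is >= pct.
    for lo, hi in zip(bounds, bounds[1:]):
        if pct <= hi:
            return f"{lo}-{hi}%"

# The clearest scenes land in the first two buckets.
print(cloud_cover_bucket(4.3))   # -> 0-10%
```

Each bucket can then be assigned a fixed marker color so that clear scenes stand out on the map.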

In summary, Landsat 8 is an incredible resource for global change research and has been used in a diverse array of scientific endeavors including the monitoring of deforestation, population growth, and glacier recession. The tool offers a great way for MATLAB users to build on the foundation of AWS support for Landsat 8 imagery. It can also be run on an EC2 instance, which avoids the download time and allows you to process many images in a shorter amount of time. MathWorks would like to hear your feedback, including any additional features you would like to have.

Categories: Blogs

A conversation about privacy and trust in open education

Cory Doctorow - 2015, March 12 - 04:01

For Open Education Week, Jonathan Worth convened a conversation about privacy and trust in open education called Speaking Openly in which educators and scholars recorded a series of videos responding to one another’s thoughts on the subject.

The takes are extremely varied, and come from Audrey Watters, Nishant Shah, Ulrich Boser, Dan Gillmor, and me, and went through two rounds. It’s an exciting way to conduct a dialogue between people who probably couldn’t all get together, and it’s designed to let you add to it.

Cory Doctorow, Audrey Watters, Nishant Shah, Dan Gillmor and Ulrich Boser in Open-Conversation – join us #SpeakingOpenly #OpenEducationWk

Categories: Blogs

Audiobook of Someone Comes to Town, Someone Leaves Town

Cory Doctorow - 2015, March 4 - 00:29


Blackstone has adapted my 2005 urban fantasy novel Someone Comes to Town, Someone Leaves Town for audiobook, narrated by Bronson Pinchot, who does a stunning job.

It’s available as a DRM-free audiobook at all the usual places, including the DRM-free audiobook store Downpour. However, iTunes and Audible refuse to carry this — or any of my other titles — because I won’t allow them to put DRM on them.

Alan is a middle-aged entrepreneur in contemporary Toronto who has devoted himself to fixing up a house in a bohemian neighborhood. This naturally brings him in contact with the house full of students and layabouts next door, including a young woman who, in a moment of stress, reveals to him that she has wings—wings, moreover, that grow back after each attempt to cut them off.

Alan understands. He himself has a secret or two. His father is a mountain, his mother is a washing machine, and among his brothers are a set of Russian nesting dolls.

Now two of the three nesting dolls, Edward and Frederick, are on his doorstep—well on their way to starvation because their innermost member, George, has vanished. It appears that yet another brother, Davey, whom Alan and his other siblings killed years ago, may have returned … bent on revenge.

Under such circumstances it seems only reasonable for Alan to involve himself with a visionary scheme to blanket Toronto with free wireless Internet connectivity, a conspiracy spearheaded by a brilliant technopunk who builds miracles of hardware from parts scavenged from the city’s dumpsters. But Alan’s past won’t leave him alone—and Davey is only one of the powers gunning for him and all his friends.


Someone Comes to Town, Someone Leaves Town
[Downpour]

Categories: Blogs

Charity Stream and Updates!

Flog - 2015, March 2 - 14:02

Well I guess it’s time to get back to…updating my website, lol. It’s been a crazy several months, with every spare minute devoted to finishing up my book! (Did you pre-order? feliciadaybook.com!) I turned in my final edits today, PARTYYYYY! Writing it has taken over a year and a half of my life, I haven’t put my heart into anything this much EVER I think. Maybe Christmas presents I made when I was eight. Anyway, I really hope you enjoy it, and pick it up to support. I think it’s really funny and hopefully inspirational. Objectively.

This week, though, I’m back on camera! (Note to self: Shave legs). Geek and Sundry is doing a 48-hour charity stream on our new Twitch channel and I’ll be there for as much as I can and still get sleep so I don’t turn into an insane big-old-B person. (I’m a 9-hour-a-nighter, and I’m perfectly happy spending 1/3 of my life sleeping because it FEELS SO GOOD.) Lots of guests are going to stop by the stream and we’re giving away prizes, it’s gonna be super fun, so tune in Tuesday 5pm PST through Thursday 5pm PST, 3/3-3/5, to watch! All donations go to the Lupus Foundation, in honor of my friend Maurissa Tancharoen Whedon, one of the Dr. Horrible writers and performers. She’s struggled with some very bad health stuff over the years, and raising awareness of how deadly this disease is means a lot to me.

Streaming on Twitch is something I’ve come to late on the internet. I was kind of scared to show myself live before in case I said something to offend someone or came across wrong or…I dunno. I have many irrational reasons in my life for doing/not doing lots of things, lol. Thank goodness I overcame it! I started my own Twitch channel last October just for the fun of it. My brother and I stream together once a week, and we each stream solo once a week. After years of pouring myself into Geek and Sundry, I just wanted to do a few things purely on my own, regardless of the “job” of it. I had no idea what an amazing community would form around my streams!

Around those streams an amazing community has popped up, #TeamHooman, and fans have created a FB page, a Twitter, a website and more. It makes me so happy to have something in my life that is this organic, where people have taken the initiative to create something out of the seed we’ve planted. Honestly, it’s why I kept doing The Guild years ago even though it was tough, because the shared ownership with the fans gave the work so much meaning. I hope that with the expansion of the Geek and Sundry Twitch, along with Ryon’s and my own streaming, we continue to grow the community in a thoughtful and supportive way. Twitch has the potential, with close moderation, to do what’s at the heart of what’s great about the internet: create community. And that, sadly, is something I think the platform of YouTube just doesn’t do.

So join in one of our streams if you have time! Especially the charity one this week. Here’s all the info: http://geekandsundry.com/view/geek-sundrys-twitch-channel-launches-march-3rd

OKAY BYE!

 

Categories: Blogs

About that dress …

Matlab Image processing blog - 2015, March 1 - 21:29

This afternoon my wife looked at me with an expression indicating she was convinced that I had finally gone completely and irrevocably crazy.

I think it was because I was carrying from room to room a flashlight, a candle, a lighter, and a chessboard scattered with colored craft sticks and puff balls.

"It's about that dress," I said.

Yes, that dress. #TheDress, as it is known on Twitter.

Three nights ago I idly opened Twitter to see what was up. The Internet was melting down over: a) llamas, and b) a picture of a dress. (I have nothing more to say here about llamas.)

This is the picture as it was originally posted on Tumblr:

Image credit: Tumblr user swiked

Some people see this dress as blue and black. Some see it as white and gold. Each group can't understand why the others see it differently.

By Friday afternoon, a myriad of explanations had popped up online and on various news outlets. Mostly, I found these initial attempts to be unsatisfying, although some better explanations have been published online since then.

Initially I didn't want to write a blog about this, because (as I often proclaim) color science makes my brain hurt. But I do know a little bit about how color scientists think, having worked with several, having read their papers and books, and having implemented their methods in software. So, here is my interpretation of this unusual visual phenomenon. It's in three parts:

  • The influence of illumination
  • The phenomenon of color constancy
  • How two different people could arrive at dramatically different conclusions about the color of that dress.

Let's start with the influence of illumination. Here is a small portion of a picture that I took today.

"Sage green," my wife said.

And here's a portion of a different picture.

"That's yellow," came the answer.

The truth: these two colors are from the same location of the same object. Here are the two original images with the locations marked.

Image A

Image B

The chessboard and other objects in these pictures are the same. The difference between the two images is caused entirely by the different light sources used for each one. Just for fun, here are the colors of the puff ball on the upper right from two different pictures. (Remember, these are pixels from the exact same spot on the same object!)

The color of the light arriving at the camera depends not only on the color of the object, but also on the nature of the illumination. As you can see in the colored patches above, changing the illumination can make a big difference. So you cannot definitively determine the dress color solely from close examination of the digital image pixels.
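A toy model makes this concrete: to a first approximation, each recorded color channel is the product of the surface's reflectance and the illuminant's strength in that band. Here is a small Python sketch; the reflectance and illuminant numbers are made up for illustration:

```python
def recorded_rgb(reflectance, illuminant):
    """Crude per-channel camera model: recorded value = reflectance x illuminant."""
    return tuple(r * i for r, i in zip(reflectance, illuminant))

sage_green = (0.45, 0.55, 0.40)   # hypothetical object reflectance (R, G, B)
daylight   = (1.00, 1.00, 1.00)   # neutral light
candle     = (1.00, 0.75, 0.45)   # warm, yellowish light

# Same object, two illuminants, two very different recorded colors.
under_day    = recorded_rgb(sage_green, daylight)
under_candle = recorded_rgb(sage_green, candle)
```

Under the neutral light the recorded color equals the reflectance; under the warm light the green and blue channels are suppressed, so the same object records as a much yellower color.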

Look at Image A again. What is the color of the index card?

Most people would call it white. If you look at just a chunk of pixels from the center of the card, though, it looks like a shade of green.

People have an amazing ability to compensate automatically and unconsciously for different light sources in a scene that they are viewing. If you looked at the same banana under a bright fluorescent light, and in candle light, and in the shade outdoors under a cloudy sky, you would see the banana as having the same color, yellow, each time. That is true even though the color spectrum of the light coming from the banana is actually significantly different in these three scenarios. Our ability to do this is called color constancy.

Our ability to compensate accurately for illumination depends on having familiar things in the scene we are viewing. It can be the sky, the pavement, the walls, the grass, the skin tones on a face. Almost always there is something in the scene that anchors our brain's mechanism that compensates for the illumination.

Now we come back to the photo of the dress. The photo completely lacks recognizable cues to help us compensate for illumination. Our brain tries to do it anyway, though, in spite of the lack of cues. The different reactions of people around the world suggest that there are two dramatically different solutions to the problem.

Consider the diagram below. It illustrates two scenes. In the first scene, on the left, a hypothetical blue and black dress is illuminated by one source. In the second scene, on the right, a hypothetical white and gold dress is illuminated by a second source. By coincidence, both scenes produce the same light at the camera, and therefore the two photographs look the same.
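Under the same crude per-channel model (recorded value = reflectance times illuminant), it is easy to construct two such scenes numerically. The reflectance and illuminant values below are contrived for illustration, but they show how a bluish dress under warm light and a whitish dress under bluish light can deliver identical signals to the camera:

```python
def recorded_rgb(reflectance, illuminant):
    # Crude per-channel camera model: recorded value = reflectance x illuminant.
    return tuple(r * i for r, i in zip(reflectance, illuminant))

bluish_dress  = (0.3, 0.4, 0.8)   # hypothetical reflectances (R, G, B)
warm_light    = (0.8, 0.8, 0.7)
whitish_dress = (0.8, 0.8, 0.8)
bluish_light  = (0.3, 0.4, 0.7)

scene1 = recorded_rgb(bluish_dress, warm_light)
scene2 = recorded_rgb(whitish_dress, bluish_light)
# Both scenes deliver the same light to the camera: scene1 == scene2
```

The photograph alone cannot distinguish the two scenes; only an assumption about the illuminant can, and that assumption is exactly where viewers split.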

Note: I tweaked the diagram above at 02-Mar-2015 14:50 UTC to clarify its interpretation.

For some people, their visual system is jumping to illumination 1, leading them to see a blue and black dress. For others, their visual system is jumping to illumination 2, leading them to see a white and gold dress.

On Friday afternoon, I heard a report that many people in the white and gold camp, when they see a version of the photograph that includes a woman's face, immediately change their perception of the dress to blue and black. This perceptual shift persists even when they view the original photograph again. This demonstrates that the presence of an object with a familiar color can significantly alter our perception of colors throughout the scene.

If you zoom way in and examine the pixels on the dress in the original image, you'll see that they are blue.

So what kind of illumination scenario could cause people to perceive this as white? I asked Toshia McCabe, a MathWorks writer who knows more than I do about color and color systems. She thinks the dress picture was edited from another one in which the dress was underexposed. As a result, the "blue coincidentally looks like a white that is in shadow (but daylight balanced)." In other words, light from a white object in shadowed daylight can arrive at the camera as blue light. So if your eye sees blue pixels, but your brain jumps to the conclusion that the original scene was taken in shaded daylight, then your brain might decide you are looking at a white dress.

For the record, my wife sees a white and gold dress on a computer monitor, but she sees blue and brown when it is printed. I see blue and brown.

Enjoy!


Published with MATLAB® R2014b

Some people see this dress as blue and black. Some see it as white and gold. Each group can't understand why the others see it differently.

By Friday afternoon, a myriad of explanations had popped up online and on various news outlets. Mostly, I found these initial attempts to be unsatisfying, although some better explanations have been published online since then.

Initially I didn't want to write a blog about this, because (as I often proclaim) color science makes my brain hurt. But I do know a little bit about how color scientists think, having worked with several, having read their papers and books, and having implemented their methods in software. So, here is my interpretation of this unusual visual phenomenon. It's in three parts:

* The influence of illumination
* The phenomenon of color constancy
* How two different people could arrive at dramatically different conclusions about the color of that dress

Let's start with the influence of illumination. Here is a small portion of a picture that I took today.

<>

"Sage green," my wife said.

And here's a portion of a different picture.

<>

"That's yellow," came the answer.

The truth: these two colors are from the same location of the same object. Here are the two original images with the locations marked.

*Image A*

<>

*Image B*

<>

The chessboard and other objects in these pictures are the same. The difference between the two images is caused entirely by the different light sources used for each one. Just for fun, here are the colors of the puff ball on the upper right from the two different pictures. (Remember, these are pixels from the exact same spot on the same object!)

<>

<>

The color of the light arriving at the camera depends not only on the color of the object, but also on the nature of the illumination. As you can see in the colored patches above, changing the illumination can make a big difference. So you cannot definitively determine the dress color solely from close examination of the digital image pixels.

Look at Image A again. What is the color of the index card?

<>

Most people would call it white. If you look at just a chunk of pixels from the center of the card, though, it looks like a shade of green.

<>

People have an amazing ability to compensate automatically and unconsciously for different light sources in a scene that they are viewing. If you looked at the same banana under a bright fluorescent light, in candle light, and in the shade outdoors under a cloudy sky, you would see the banana as having the same color, yellow, each time. That is true even though the color spectrum of the light coming from the banana is actually significantly different in these three scenarios. Our ability to do this is called _color constancy_.

Our ability to compensate accurately for illumination depends on having familiar things in the scene we are viewing. It can be the sky, the pavement, the walls, the grass, the skin tones on a face. Almost always there is something in the scene that anchors our brain's mechanism for compensating for the illumination.

Now we come back to the photo of the dress. The photo completely lacks recognizable cues to help us compensate for illumination. Our brain tries to do it anyway, in spite of the lack of cues. The different reactions of people around the world suggest that there are two dramatically different solutions to the problem.

Consider the diagram below. It illustrates two scenes. In the first scene, on the left, a hypothetical blue and black dress is illuminated by one source. In the second scene, on the right, a hypothetical white and gold dress is illuminated by a second source. By coincidence, the two different illuminants combine with the two different dress color schemes to produce the same light at the camera, and therefore the same photograph.

<>

For some people, their visual system jumps to illumination 1, leading them to see a blue and black dress. For others, their visual system jumps to illumination 2, leading them to see a white and gold dress.

On Friday afternoon, I heard a report that many people in the white and gold camp, when they see a version of the photograph that includes a woman's face, immediately change their perception of the dress to blue and black. This perceptual shift persists even when they view the original photograph again. It demonstrates that the presence of an object with a familiar color can significantly alter our perception of colors throughout the scene.

If you zoom way in and examine the pixels on the dress in the original image, you'll see that they are blue.

<>

So what kind of illumination scenario could cause people to perceive this as white? I asked Toshia McCabe, a MathWorks writer who knows more than I do about color and color systems. She thinks the dress picture was edited from another one in which the dress was underexposed. As a result, the "blue coincidentally looks like a white that is in shadow (but daylight balanced)." In other words, light from a white object in shadowed daylight can arrive as blue light at the camera. So if your eye sees blue pixels, but your brain jumps to the conclusion that the original scene was taken in shaded daylight, then your brain might decide you are looking at a white dress.

For the record, my wife sees a white and gold dress on a computer monitor, but she sees blue and brown when it is printed. I see blue and brown.

Enjoy!
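If you want to poke at the pixel values yourself, the rgb2lab function mentioned earlier is a handy tool: the L* channel separates lightness from the a* and b* color channels, where a negative b* indicates blue. This is just a sketch; the RGB triple below is a hypothetical value roughly like the dress pixels, so sample your own copy of the image to get real numbers.

```matlab
% A pixel roughly like those sampled from the "blue" region of the
% dress photo (hypothetical values -- sample your own copy to check).
dressRGB = [0.44 0.47 0.59];

% Convert sRGB to L*a*b*. Requires Image Processing Toolbox (R2014b or
% later for rgb2lab).
dressLab = rgb2lab(dressRGB);

% A negative b* means the pixel is on the blue side. A middling L* is
% consistent with either dark blue fabric in bright light or white
% fabric seen in shadow -- which is exactly the ambiguity at work here.
fprintf('L* = %.1f, a* = %.1f, b* = %.1f\n', dressLab)
```

The point of the exercise: the numbers alone can't settle the argument, because the same L*a*b* values are consistent with more than one combination of surface color and illumination.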

Categories: Blogs

Internet-fired elections and the politics of business as usual

Cory Doctorow - 2015, February 27 - 02:49


I’ve got a new Guardian column, Internet-era politics means safe seats are a thing of the past, which analyzes the trajectory of Internet-fuelled election campaigning since Howard Dean, and takes hope in the launch of I’ll Vote Green If You Do.

The Obama campaigns went further. Building on the Dean campaign, two successive Obama campaigns raised millions in small-money donations, creating purpose-built Facebook-like social networks and using them to recruit highly connected supporters to work their way through their social graphs, contacting friends and friends-of-friends to pitch them on donating and voting.

But both times, Obama took office and immediately shut down these grassroots networks. The Obama governance style is big on closed-door, back-room horse-trading – Obama came out of Chicago Democratic Machine politics, after all – and this is fundamentally incompatible with having a bunch of true believers running around waving the flag, making categorical statements about which compromises are (and are not) acceptable.

Governing in tandem with a grassroots is a hard problem. The best example we have of this is the Tea Party, which, despite the big-money backers who bankrolled it, is composed of people who are genuinely passionate about politics and are serious about insisting that the politicians they backed act in accord with their principles.

Leaving aside my political differences with the Tea Party, it’s fair to say that this has been a mixed bag for Republican lawmakers, whose caucus has been responsible for a congressional deadlock that’s run on for years, so that it’s become normal for vital US governmental agencies to shut down and send everyone home until a budget can be passed.

Internet-era politics means safe seats are a thing of the past [The Guardian]

Categories: Blogs

GE ordered by FDA to recall nearly 13'000 MRI scanners!

NMR blog - 2015, February 21 - 05:00

A serious safety problem with some MRI-scanners pinpointed by FDA.

Categories: Blogs

We’re back in action!

Green City Acres - 2015, February 19 - 18:18

After a full winter off farming this year, I’m feeling more recharged than ever, and so stoked to get back in the dirt. We’ve been having an incredibly warm late winter this year, which is perfect, because it’s allowed us to get some critical things done before I take off again to teach a series of workshops in BC, California, and Mexico. I know, poor me, right? 

A lot of what’s happening at this time is greenhouse prep, which includes forking and tilling beds in the greenhouses, and starting some early nursery stock like tomatoes, peppers, and kale. 

For my greenhouse prep, I’m forking beds in some of them with a no-till technique. It’s essentially the same as using a broadfork, except I don’t have one yet; I’ve got one being made for this year. I amend the soil with some organic fertilizer and compost by raking it in, then water the beds heavily, then cover them in tarps. This is for a stale seed bed, to encourage weed growth, so that in a couple of weeks I can flame weed these beds and direct seed my first early spring crops into them. I did the same in my other greenhouse, but there I used the tiller. The soil there was a lot more compacted from the previous year, as we had tomatoes in it. So, I rototilled those beds, then used the same stale seed bed approach. 

This is also the time of year when I bring back my temporary indoor nursery: a shelving unit that I set up in my kitchen that can hold 48 flats for my first early plantings. For the size of my operation, and since I only do a small amount of nursery stock, I prefer to keep these inside as long as possible. It’s cheaper for me to have them in my house, as I’m already paying to heat my house anyway. By about late April, they’ll all go out into my greenhouse, as we’ll need a lot more room to pot up tomatoes and other summer crops. 

 Also, at this time, I’ll start tearing off the tarps at a new plot that I’ll be developing this year. Stay tuned for how all of that unfolds!

 Happy planting, or shovelling snow, wherever you are in the world!

 #theurbanfarmer 

Curtis Stone

Categories: Blogs

Spin Systems, Math Theorems, Sister Celine, and Machine Proofs

NMR blog - 2015, February 19 - 05:00

NMR certainly uses Math. What about the other way round?

Categories: Blogs

'Sight Unseen' Preshow Press Photos

Casey McKinnon - 2015, February 18 - 21:07






I just wanted to post a few preshow press photos we had taken the other day by photographer Ed Krieger for our upcoming production of Sight Unseen at The Lounge Theatre in Hollywood. Every day in rehearsal is a joy and I'm very excited for our opening on March 14th. Tickets are available here!

Categories: Blogs

New comments on sepsis diagnosis by table-top NMR

NMR blog - 2015, February 17 - 05:00

NanoMR sale, E.Fukushima's involvement - and his Arctic NMR plans

Categories: Blogs