HOFs on Hubble's Ultra Deep Field image

I'm trying to count the galaxies in Hubble's Ultra Deep Field image. I created a false-color image from a threshold: where the red channel > 10 the pixel turns black, otherwise red. What I'm left with is a big list of 0, 0, 0, 255 and 255, 0, 0, 255 color pixels. I've tried HOFs and a for-loop to collapse the double, triple, and quadruple 0, 0, 0, 255 entries into one single entry. Is there an elegant way to do this? For example: keep one item when the item is successively followed by more of the same item?
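The run-collapsing operation asked for here can be sketched in Python (a translation of the idea, not the Snap! blocks themselves; the `RED`/`BLACK` names and 4-tuple pixel encoding are illustrative assumptions):

```python
from itertools import groupby

BLACK = (0, 0, 0, 255)    # thresholded "galaxy" pixel
RED = (255, 0, 0, 255)    # background pixel

def collapse_runs(pixels):
    """Collapse each run of identical consecutive pixels into one entry."""
    return [key for key, _run in groupby(pixels)]

# Three consecutive blacks collapse to one entry:
row = [RED, BLACK, BLACK, BLACK, RED, BLACK, RED]
result = collapse_runs(row)   # red, black, red, black, red
```

`groupby` batches consecutive equal items, which is exactly the "keep one item per run" behavior; in Snap! the closest equivalent is a `keep` over indices that compares each item with its predecessor.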

Here's the link to my project:

Thank you for the effort.

Looks good, but can you elaborate more on what you are trying to do? Are you trying to combine large areas of black into one entry in a list, or something else?

Thank you @legoman3.14. What I'm trying to accomplish is grouping runs of successive 0, 0, 0, 255 pixels into one entry, or keeping one entry in, for example, a group of three and changing the other entries back to a red pixel, 255, 0, 0, 255.

There is a slightly modified program: https://snap.berkeley.edu/snap/snap.html#present:Username=dardoro&ProjectName=Galaxy_count
Data is processed as a List<Boolean>, then rendered.
Costumes are stored and can be cycled with [space] to check the accuracy of the results.
Left-edge detection, as you proposed, has far fewer false positives than the first-phase threshold normalization, but still too many.
More accurate results can be achieved with flood fill or recursive sharpening.
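For reference, convolving an image with a 3×3 sharpening kernel looks roughly like this in plain Python (a minimal sketch; edge pixels are left untouched for brevity, and the grey-value grid is an assumed encoding):

```python
# Classic 3x3 sharpening kernel: boost the center, subtract the neighbors.
SHARPEN = [[ 0, -1,  0],
           [-1,  5, -1],
           [ 0, -1,  0]]

def convolve3x3(image, kernel):
    """Convolve a 2-D list of grey values with a 3x3 kernel, clamping to 0..255."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]            # copy; border rows/cols stay as-is
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            acc = 0
            for ky in range(3):
                for kx in range(3):
                    acc += kernel[ky][kx] * image[y + ky - 1][x + kx - 1]
            out[y][x] = min(255, max(0, acc))  # clamp to byte range
    return out
```

On a uniform region the kernel weights sum to 1, so flat areas pass through unchanged while edges get amplified, which is what makes faint galaxy boundaries easier to threshold.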
@bh Maybe the APL library can be used to calculate the convolution of the image with a sharpening matrix.
Or @toontalk's AI Snap! extension can be used: https://ecraft2learn.github.io/ai/.

Untitled script pic
I'm sure this could be made more elegant, but this'll do it.

You can store the state of the last visited pixel, then check whether the next one becomes active. With data encoded as empty = false, non-empty = true, it can be processed with
Galaxy_count script pic (2)
Galaxy_count script pic (1)
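In text form, this last-pixel-state idea is a rising-edge counter: each false→true transition marks the left edge of one run. A Python sketch under the same empty = false / non-empty = true encoding (names are illustrative):

```python
def count_rising_edges(bits):
    """Count false->true transitions, i.e. the left edge of each run of trues."""
    prev = False   # state of the last visited pixel
    count = 0
    for b in bits:
        if b and not prev:   # pixel just became active
            count += 1
        prev = b
    return count

# Two runs of true -> two rising edges.
edges = count_rising_edges([False, True, True, False, True])   # 2
```

Because only transitions are counted, runs of any length (one, two, three, or more pixels) each contribute exactly one to the total.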

Brian, I tried it. But in my case this works only on groups of three of the same record; the groups of one, two, and four get lost. According to Wikipedia, Hubble's UDF image contains approximately 10,000 galaxies. That is the number you get when you combine the 0, 0, 0, 255 pixels on the clean false-color image. I assumed that successive records belong to one and the same galaxy; that's why I tried to reduce all groups of 1, 2, 3, 4, 5 to one record.

@dardoro You took the script way too far for me. I'm still thinking about your is_become_true_val? block. Thanks for that block; "count the colored pixels in an image" is harder than I thought it would be.

Ah, I misunderstood what you were asking for. In that case, keep
(value = red) or (item (index+1) = red)
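That `keep` expression can be sketched in Python (an interpretation: red pixels survive, and from each black run only the pixel whose successor is red, i.e. the run's last pixel; keeping the final list item when it has no successor is an assumption about the intended edge case):

```python
RED = (255, 0, 0, 255)

def keep_last_of_runs(pixels):
    """Keep red pixels plus only the last black pixel of each black run:
    a black pixel is dropped when the next pixel is also black."""
    kept = []
    for i, p in enumerate(pixels):
        nxt = pixels[i + 1] if i + 1 < len(pixels) else None
        if p == RED or nxt == RED or nxt is None:
            kept.append(p)
    return kept
```

This is the same run-collapsing result as before, but phrased as a single filter over (value, index) pairs, which maps directly onto Snap!'s `keep` with the index variable enabled.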

But I'm confused: if you do that and there's more than one such cluster, you won't know where they appeared in the data after this operation. Right?

That's right @bh. Its only use is for counting the unique records (without the groups, which I assume belong to the same galaxy). But I might be wrong about that. The purpose of this project was to count the individual galaxies in Hubble's UDF image. Wikipedia and NASA tell us the number should be around 10,000 galaxies. I find it incomprehensible in its beauty that the patch of sky Hubble was looking at is smaller than the moon at night. 10,000 galaxies × 100–400,000,000,000 stars each. So in this project I try to count the galaxies... I tried

And got the number 10491.

Sounds good... How big is the initial list of pixels?

@bh The initial pixel list has a size of 129,600 pixels. This is what I've got so far. I added some comments on the steps I took.

Wow so this approach really worked! Congratulations!

@bh @dardoro I tried both on the one-dot star chart. Both scripts work well at finding the first pixel of a star in a row, but finding the relation between pixels in the y direction that belong to the same star is cumbersome. Bonus: what you get in return is a nice crescent moon. Thank you for your help.
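Linking pixels across rows is exactly what 2-D connected-component labeling does, and it is where the flood-fill idea mentioned earlier fits in. A Python sketch over a boolean grid (assumptions: 4-connectivity, plain nested lists, an iterative stack instead of recursion to avoid depth limits):

```python
def count_components(grid):
    """Count 4-connected groups of True cells via iterative flood fill."""
    h, w = len(grid), len(grid[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if grid[y][x] and not seen[y][x]:
                count += 1                      # found a new star/galaxy
                stack = [(y, x)]                # flood-fill its whole component
                while stack:
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and grid[cy][cx] and not seen[cy][cx]:
                        seen[cy][cx] = True
                        stack += [(cy - 1, cx), (cy + 1, cx),
                                  (cy, cx - 1), (cy, cx + 1)]
    return count
```

Unlike the per-row edge counter, this merges pixels vertically as well, so a galaxy spanning several rows is counted once instead of once per row.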


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.