A Photography forum. PhotoBanter.com


Two questions



 
 
  #31  
Old September 16th 15, 10:05 PM posted to rec.photo.digital
nospam
external usenet poster
 
Posts: 24,165
Default Two questions

In article , Mayayana
wrote:

| the part you don't get is that a filter operation can be parallelized
| and spread across multiple cores.

Like I said, try it for yourself.


not only have i done that but i've written multithreaded code and
benchmarked it.

Why argue with me
when you can monitor your own system?


because you're wrong. very, very wrong.

here's a screenshot i posted before:
http://media.bestofmicro.com/F/L/251985/original/res_app_photoshop.png

Floyd
posted an interesting description that I expect
is probably typical: Limited multi-core utilization,
in some cases.


the limitation depends on the specific calculations and how well it can
be parallelized.
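[As a sketch of what "spread across multiple cores" means in practice: the image is split into bands, each band is filtered independently, and the results are merged at the end. This is illustrative Python only, with a toy scale-each-pixel filter standing in for a real sharpen kernel; none of the names are Photoshop's actual code.]

```python
# Toy "filter" parallelized across worker threads: split the image
# into horizontal bands, process each band independently, stitch the
# results back together. (A real sharpen convolves a kernel and uses
# native threads or processes to sidestep Python's GIL, but the
# partitioning logic is the same.)
from concurrent.futures import ThreadPoolExecutor

def filter_band(band):
    # Per-band work; no band depends on another, so bands can run
    # concurrently on separate cores.
    return [[min(255, p * 2) for p in row] for row in band]

def parallel_filter(image, workers=4):
    n = max(1, len(image) // workers)
    bands = [image[i:i + n] for i in range(0, len(image), n)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(filter_band, bands)
    # Reassemble the bands in order -- the merge step at the end.
    return [row for band in results for row in band]

image = [[10, 120, 200]] * 8      # tiny stand-in for a real image
print(parallel_filter(image)[0])  # [20, 240, 255]
```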
  #32  
Old September 16th 15, 10:06 PM posted to rec.photo.digital
nospam

In article , PeterN
wrote:

Multiple cores can only be used for multiple
threads/processes. If you want to print while
using PS then a second core is nice. You may
also be able to do two things at once in PS.
But even that seems a stretch. If you do something
like apply a filter to a very large image, that's a
single operation. It can only run on one core. And
what else are you going to do concurrently?

I imagine that's what Chris Cox is talking about.
PS can use the cores if you're demanding functionality
from multiple threads/processes at the same time,
but for one intensive operation, multiple cores will
be slower because each core is slower than the total.

On XP I use 2-core because I don't think XP
can optimally use more. Win7+ is probably
better, but optimization still means having uses for
those cores. Since I'm rarely doing more than two
things at once, I'd rather have two operations
running at 1800 MHz than have 4 cores running
at 900 MHz each, but with only one or two used.

If you're running clean, it's unlikely you'll see
much benefit from more cores, and Intel vs
AMD shouldn't matter. (Though specific CPU
models get different ratings.) In other words,
if you have PS applying a sharpen to a giant
image, maybe it takes 30 seconds, but what
else are you going to have PS do at the same
time that could increase efficiency? Not much.

The only scenario that makes sense to me for
more cores would be a system weighted down
with AV, malware hunters, excessive services,
etc. If you have 4 cores you might be able to
use them all with so much crap running, where
two cores might be forced to allocate time slices
to multiple processes, thus being slightly less
efficient. But aside from servers, it's hard for me
to see the benefit of a large number of cores.
It just means that each core is running slower.

You can research this yourself. Run Task Manager
and then use PS as usual. You'll probably find that
a demanding operation is using 50%, 25%, etc of
the CPU, depending on how many cores you have.
(2 cores -- max intensity is 50% of CPU. 4 cores --
max intensity is 25% of CPU. Etc.)
Is another process or another PS operation maxing
out another core? If your cores are not being used
then the increase in cores is just slowing down your
machine.
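[The Task Manager percentages above are just this arithmetic: one busy core is full, the other N-1 sit idle. A trivial sketch; `single_thread_ceiling` is a made-up illustrative name.]

```python
# Why a single-threaded operation tops out at 100/N percent overall
# in Task Manager: the one busy core is pegged, the other N-1 cores
# contribute 0% to the total.
def single_thread_ceiling(logical_cores):
    return 100.0 / logical_cores

print(single_thread_ceiling(2))  # 50.0
print(single_thread_ceiling(4))  # 25.0
```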


Thanks, sounds like good information.


it's completely bogus information.

Since I am doing pre-purchase
research, I will not be doing the experiments. I am thinking quad core
with about a 3.5-3.8 GHz CPU. I know there are faster, but I am not yet
convinced that the additional speed is worth the extra cost.


that's plenty fast.

what's more important is getting an ssd and a lot of memory, 16 gig
should be sufficient unless you're *really* pushing it hard.
  #33  
Old September 16th 15, 10:06 PM posted to rec.photo.digital
nospam

In article , Mayayana
wrote:

| I would expect it to use multiple cores. I don't use
| Windows or a Mac, but I have system monitoring software
| that graphically shows the load on each core. Typically
| a sharpen operation on a large image takes long enough
| to easily see what actually does happen. On an 8-core
| system about 90 percent of the time is spent with a
| single core showing 100% usage all of the time, and from
| 2 to 3 other cores being hit repeatedly for short
| intervals. In the last few seconds, which I assume is
| when it puts all of the segments back into the image
| buffer, all 8 cores get hit together for an extended
| period. Extended is longer than the short hits earlier
| in the process, but it doesn't actually last very long
| and the entire process is finished.
|
| Another program uses one single core for the entire
| process.

That's an interesting description. So there seems to
be some optimizing of cores, but in a limited way. I
can't imagine it could be much more efficient than that.


it depends on the calculation being done.

some things are well suited to being parallelized and others are not.
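[How "well suited" a calculation is can be put in numbers with Amdahl's law: if a fraction p of the work parallelizes and the rest is serial, n cores give at most 1/((1-p)+p/n) speedup. A quick sketch:]

```python
# Amdahl's law: speedup from n cores when only a fraction p of the
# work can run in parallel. Even a filter that is 95% parallel tops
# out around 20x no matter how many cores are added.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

print(round(amdahl_speedup(0.95, 8), 2))   # 5.93
print(round(amdahl_speedup(0.50, 8), 2))   # 1.78
```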

Maybe it's true that an image could be broken up for
sharpening, but that would be awkward, and doesn't
seem to be what's happening in what you describe.


it's not awkward at all and exactly what's happening.

| If you had four cores it would not run at half the
| speed. Same two cores, same speed. Just that
| the other two cores would be idle.

What I mean is that if you have a 4 GHz CPU with
4 cores then it's 1 GHz per core. With 2 cores it's
2 GHz per core. So it will actually be slower to do
a single-core operation on the 4 core than on the
2 core.


where in the world did you come up with that rubbish?

it's 4 ghz *per* core and cores not used are idle.

you have *no* clue on this stuff.
  #34  
Old September 16th 15, 10:06 PM posted to rec.photo.digital
nospam

In article , Mayayana
wrote:

| it needs to respond to clicks but not display and move it.

In Windows I can call the system to find the mouse
position at any moment. The system is tracking
it. If you move the mouse the system also has
to repeatedly calculate to repaint the screen. That's
what the OS does.


you clearly don't understand what hardware cursor support is.

the cursor position is not calculated by the os, it's managed by
hardware and then added to the frame buffer directly.

you can still query its position and the os will return it, but the key
is that the os is not constantly tracking every movement.
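[To make the idea concrete, the overlay step can be simulated: the display controller composites a small sprite over the frame buffer contents on the way to the display, leaving the buffer itself untouched. A toy Python model, purely illustrative -- real hardware does this per scanline with no CPU involvement:]

```python
# Software simulation of a hardware cursor overlay: the visible image
# is the frame buffer with a small sprite composited on top at (x, y).
# The frame buffer itself is never modified.
def scan_out(frame, sprite, x, y):
    out = [row[:] for row in frame]   # copy; leave the buffer intact
    for dy, srow in enumerate(sprite):
        for dx, pixel in enumerate(srow):
            if pixel is not None:     # None = transparent sprite pixel
                out[y + dy][x + dx] = pixel
    return out

frame = [[0] * 4 for _ in range(3)]
sprite = [[9, None], [9, 9]]
print(scan_out(frame, sprite, 1, 0))
# [[0, 9, 0, 0], [0, 9, 9, 0], [0, 0, 0, 0]]
```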

If it didn't do that you wouldn't see the cursor
move. Maybe you can't see any mouse movement
on your Mac? In that case you might be right. Or
your Mac might be dead.


wrong.

All of this is pointless sidetrack. If you actually
read what I wrote you could possibly see that
mouse movement was just an example.


not a very good example, because even if the os was tracking the mouse,
as it did in *much* older systems, it would add little to no cpu
load.

it's completely insignificant compared to running a sharpening or blur
filter or whatever else.
  #35  
Old September 17th 15, 05:15 AM posted to rec.photo.digital
Floyd L. Davidson

"Mayayana" wrote:
| If you had four cores it would not run at half the
| speed. Same two cores, same speed. Just that
| the other
| two cores would be idle.

What I mean is that if you have a 4 GHz CPU with
4 cores then it's 1 GHz per core. With 2 cores it's
2 GHz per core. So it will actually be slower to do
a single-core operation on the 4 core than on the
2 core.


All 4 cores are totally independent as far as GHz is
concerned. Just as if they were on 4 separate chips, or
for that matter in 4 separate boxes.

What you are describing is true of a single core that is
hyperthreaded. With that there is one clock and one
core, but the hyperthreading makes it look to the
software as if there are multiple cores.
Multi-threading in that way slows everything down by 1/2
each time the number of threads is doubled. (Actually
it is slightly slower than half due to administrative
overhead for context switching.)

But multi-core CPU's actually have different hardware
for each core, each core runs independently at full
clock speed, asynchronously to the other cores.

Here's the difference in practice... If you do a lot of
text editing, either writing papers or reading and
writing email, having a hyperthreaded single core vs
just a single core is very nice. Clock speed doesn't
make any difference to a text editor, because vastly
more than 99% of all clock cycles are no-ops that just
give up their slice immediately while waiting for
keyboard input. Everything else keeps right on chugging
along without a hitch, yet the response time of the
editor to keyboard input is always very quick. Some
interrupt that bogs down 1 logical CPU for 2 or 3 seconds
won't cause the keyboard to wait that long. This is
probably very useful for a typical laptop, as an
example.

But on a desktop, if it is used for anything that is CPU
intensive, actual multiple cores are much better. Not
just at speeding up the CPU intensive processes, but
allowing that same quick response time for the keyboard.
What works is 1 core for every CPU process being run in
parallel, plus one for everything else.

The last time I even thought about a single core desktop
was over 20 years ago. I'm not sure about a laptop, but
that wasn't too long after.

--
Floyd L. Davidson http://www.apaflo.com/
Ukpeagvik (Barrow, Alaska)
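[One way to see the distinction Floyd describes is to compare logical CPUs, which is what hyperthreading exposes to software, against physical cores. A rough Python sketch: os.cpu_count() is standard and works everywhere, while the physical-core count is parsed from /proc/cpuinfo, so that part is Linux-only and returns None elsewhere.]

```python
# Logical CPUs vs physical cores. os.cpu_count() reports logical CPUs
# on all platforms; physical_cores_linux() counts distinct
# (physical id, core id) pairs from /proc/cpuinfo -- with
# hyperthreading, two logical CPUs share the same pair.
# (psutil.cpu_count(logical=False) is the portable alternative, but
# it needs a third-party library.)
import os

def logical_cpus():
    return os.cpu_count()

def physical_cores_linux():
    try:
        cores, phys = set(), None
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("physical id"):
                    phys = line.split(":", 1)[1].strip()
                elif line.startswith("core id"):
                    cores.add((phys, line.split(":", 1)[1].strip()))
        return len(cores) or None   # None if the fields were absent
    except OSError:
        return None                 # not Linux, or /proc unavailable

print(logical_cpus(), physical_cores_linux())
```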
  #36  
Old September 17th 15, 05:25 AM posted to rec.photo.digital
Floyd L. Davidson

nospam wrote:
the cursor position is not calculated by the os, it's managed by
hardware and then added to the frame buffer directly.

you can still query its position and the os will return it, but the key
is that the os is not constantly tracking every movement.


So what. It skirts around using the OS, but the entire
question was about CPU use, not the OS. A hardware
cursor uses the CPU. It has to have a free CPU, and by
using the CPU it prevents a lower priority process from
gaining access, or more annoying it might not get a CPU
when needed.

If it didn't do that you wouldn't see the cursor
move. Maybe you can't see any mouse movement
on your Mac? In that case you might be right. Or
your Mac might be dead.


wrong.


Yes you are wrong. He's got a very good point!

All of this is pointless sidetrack. If you actually
read what I wrote you could possibly see that
mouse movement was just an example.


not a very good example, because even if the os was tracking the mouse,
as it did in *much* older systems, it would be add little to no cpu
load.


Bad example on your part. The question is not the load
of the cursor, it's how silky smooth its movement is,
as opposed to jerking around at noticeable intervals
because no CPU is available to move it.

it's completely insignificant compared to running a sharpening or blur
filter or whatever else.


If some filter or a sharpening process jerks along
without sufficient CPU access the user won't even notice
it.

But any cursor, mouse, or keyboard slowdown that causes
more than a 1/8 second latency will be noticed, and
twice that will be irritating.

--
Floyd L. Davidson http://www.apaflo.com/
Ukpeagvik (Barrow, Alaska)
  #37  
Old September 17th 15, 05:40 AM posted to rec.photo.digital
nospam

In article , Floyd L. Davidson
wrote:

the cursor position is not calculated by the os, it's managed by
hardware and then added to the frame buffer directly.

you can still query its position and the os will return it, but the key
is that the os is not constantly tracking every movement.


So what. It skirts around using the OS, but the entire
question was about CPU use, not the OS. A hardware
cursor uses the CPU.


no it doesn't.

the whole point of a hardware cursor is so it *doesn't* use the cpu at
all.

It has to have a free CPU, and by
using the CPU it prevents a lower priority process from
gaining access, or more annoying it might not get a CPU
when needed.


wrong.

If it didn't do that you wouldn't see the cursor
move. Maybe you can't see any mouse movement
on your Mac? In that case you might be right. Or
your Mac might be dead.


wrong.


Yes you are wrong. He's got a very good point!


i'm not wrong and his points (not just this one) are bogus.

All of this is pointless sidetrack. If you actually
read what I wrote you could possibly see that
mouse movement was just an example.


not a very good example, because even if the os was tracking the mouse,
as it did in *much* older systems, it would add little to no cpu
load.


Bad example on your part. The question is not the load
of the cursor, it's how silky smooth its movement is,
as opposed to jerking around at noticeable intervals
because no CPU is available to move it.


a hardware cursor is *always* silky smooth because it's completely
independent of the cpu load. that's the whole point.

that means that even if the os crashes and totally locks up the system,
the cursor will still move freely.

it's completely insignificant compared to running a sharpening or blur
filter or whatever else.


If some filter or a sharpening process jerks along
without sufficient CPU access the user won't even notice
it.


if it's video or animation, users may very well see frame drops or
other jerkiness if it can't keep up for whatever reason.

if they're running a filter in photoshop, probably not, although they
might notice it takes a little longer than it normally does.

But any cursor, mouse, or keyboard slowdown that causes
more than a 1/8 second latency will be noticed, and
twice that will be irritating.


that's why it's done in hardware, so that there isn't any latency,
regardless of cpu load.
  #38  
Old September 17th 15, 06:10 AM posted to rec.photo.digital
Floyd L. Davidson

nospam wrote:
In article , Floyd L. Davidson
wrote:

the cursor position is not calculated by the os, it's managed by
hardware and then added to the frame buffer directly.

you can still query its position and the os will return it, but the key
is that the os is not constantly tracking every movement.


So what. It skirts around using the OS, but the entire
question was about CPU use, not the OS. A hardware
cursor uses the CPU.


no it doesn't.

the whole point of a hardware cursor is so it *doesn't* use the cpu at
all.


Bull****. Not as much, because it doesn't require an OS to
be tracking it, but the CPU absolutely is used.

It has to have a free CPU, and by
using the CPU it prevents a lower priority process from
gaining access, or more annoying it might not get a CPU
when needed.


wrong.


Not in any way that makes a difference, or that you can express
correctly.

The load is reduced, but it is not eliminated.

If it didn't do that you wouldn't see the cursor
move. Maybe you can't see any mouse movement
on your Mac? In that case you might be right. Or
your Mac might be dead.

wrong.


Yes you are wrong. He's got a very good point!


i'm not wrong and his points (not just this one) are bogus.


Dead wrong.

All of this is pointless sidetrack. If you actually
read what I wrote you could possibly see that
mouse movement was just an example.

not a very good example, because even if the os was tracking the mouse,
as it did in *much* older systems, it would add little to no cpu
load.


Bad example on your part. The question is not the load
of the cursor, it's how silky smooth its movement is,
as opposed to jerking around at noticeable intervals
because no CPU is available to move it.


a hardware cursor is *always* silky smooth because it's completely
independent of the cpu load. that's the whole point.

that means that even if the os crashes and totally locks up the system,
the cursor will still move freely.


Yeah right.

it's completely insignificant compared to running a sharpening or blur
filter or whatever else.


If some filter or a sharpening process jerks along
without sufficient CPU access the user won't even notice
it.


if it's video or animation, users may very well see frame drops or
other jerkiness if it can't keep up for whatever reason.

if they're running a filter in photoshop, probably not, although they
might notice it takes a little longer than it normally does.

But any cursor, mouse, or keyboard slowdown that causes
more than a 1/8 second latency will be noticed, and
twice that will be irritating.


that's why it's done in hardware, so that there isn't any latency,
regardless of cpu load.


Thick skull thinking.

--
Floyd L. Davidson http://www.apaflo.com/
Ukpeagvik (Barrow, Alaska)
  #39  
Old September 17th 15, 06:52 AM posted to rec.photo.digital
nospam

In article , Floyd L. Davidson
wrote:

the cursor position is not calculated by the os, it's managed by
hardware and then added to the frame buffer directly.

you can still query its position and the os will return it, but the key
is that the os is not constantly tracking every movement.

So what. It skirts around using the OS, but the entire
question was about CPU use, not the OS. A hardware
cursor uses the CPU.


no it doesn't.

the whole point of a hardware cursor is so it *doesn't* use the cpu at
all.


Bull****. Not as much, because it doesn't require an OS to
be tracking it, but the CPU absolutely is used.


no it isn't. a *separate* chip controls the cursor movements.

It has to have a free CPU, and by
using the CPU it prevents a lower priority process from
gaining access, or more annoying it might not get a CPU
when needed.


wrong.


Not in any way that makes a difference, or that you can express
correctly.

The load is reduced, but it is not eliminated.


there is no load on the cpu.

If it didn't do that you wouldn't see the cursor
move. Maybe you can't see any mouse movement
on your Mac? In that case you might be right. Or
your Mac might be dead.

wrong.

Yes you are wrong. He's got a very good point!


i'm not wrong and his points (not just this one) are bogus.


Dead wrong.


yes, you are.

you are once again talking about things you know nothing about and
assuming that only your way of doing things applies.

All of this is pointless sidetrack. If you actually
read what I wrote you could possibly see that
mouse movement was just an example.

not a very good example, because even if the os was tracking the mouse,
as it did in *much* older systems, it would add little to no cpu
load.

Bad example on your part. The question is not the load
of the cursor, it's how silky smooth it's movement is,
as opposed to jerking around at noticeable intervals
because no CPU is available to move it.


a hardware cursor is *always* silky smooth because it's completely
independent of the cpu load. that's the whole point.

that means that even if the os crashes and totally locks up the system,
the cursor will still move freely.


Yeah right.


it is as i describe.

it's completely insignificant compared to running a sharpening or blur
filter or whatever else.

If some filter or a sharpening process jerks along
without sufficient CPU access the user won't even notice
it.


if it's video or animation, users may very well see frame drops or
other jerkiness if it can't keep up for whatever reason.

if they're running a filter in photoshop, probably not, although they
might notice it takes a little longer than it normally does.

But any cursor, mouse, or keyboard slowdown that causes
more than a 1/8 second latency will be noticed, and
twice that will be irritating.


that's why it's done in hardware, so that there isn't any latency,
regardless of cpu load.


Thick skull thinking.


it's reality.
  #40  
Old September 17th 15, 07:12 AM posted to rec.photo.digital
Sandman

In article , Mayayana wrote:

The issue is whether PS can actually use multiple cores in any
significant way.


Of course it can.

For that it needs to be running two or more
processor -intensive operations concurrently. Do you often run a
sharpening routine on a 40 MB image while at the same time running
another filter? If not then the sharpening will run faster with less
cores because the core being used will have a higher MHz speed.


Uh, no, most (if not all) graphics operations in PS support threading, as well as
GPU offload, freeing up CPU.

No one needs to go by what I'm saying. In Windows one can run Task
Manager to see usage in real time.


Of course, and there you can see a CPU-heavy operation in Photoshop utilizing
every available core.

There may be something similar on
Macs. So rather than carpet-bombing the thread with empty,
un-qualified pronouncements and insults, why not do some
experimenting for yourself? You can then decide what's best for you.
Maybe you'll find that PS is somehow running your sharpening routine
on 2 cores, but that's *very* unlikely.


Not if you only have two cores. I have sixteen cores and filters obviously run
on all eight cores:

http://jonaseklundh.se/files/photoshop_cpu.jpg

The routine is a single operation that needs to go through the image
bytes with a math operation. There just aren't two things to do at
once.


Of course there is. This is handled the same way 3D applications split up a given
scene and use multiple cores, or multiple computers, to render each part and
composite it into one frame in the end.

--
Sandman
 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 PhotoBanter.com.
The comments are property of their posters.