Travel without a camera



 
 
  #121  
Old June 22nd 17, 12:09 AM posted to rec.photo.digital
Eric Stevens

On Wed, 21 Jun 2017 05:53:07 -0700 (PDT), -hh
wrote:

On Wednesday, June 21, 2017 at 12:04:13 AM UTC-4, Eric Stevens wrote:
-hh wrote
Eric Stevens wrote:
[...]
No, but then the processing falls back to the CPU, which can be
quite a bit slower.

True, but only if Adobe actually is using that hardware somehow/somewhere.

Now granted, we do know that *some* filters/etc have been rewritten
by Adobe to leverage GPUs ... but [...]

Overall, [Adobe] only has to 'fall back' for the (still) relatively small
set of subroutines which do actually employ GPU acceleration...if & when
so invoked by one's workflow. For some users, that in and of itself may very
well be effectively a "never" right from the start.

And that's why I said that if the difference is invisible to one's
workflow at the productivity level, then it can't have a positive ROI
and may as well not exist for that user.


As I have previously explained, I have no ROI so that wasn't
one of my considerations.


Understood, but the problem with using that approach is that it
also means that you *never* have any justification to buy a new
PC no matter how slow the old one becomes.


Goodness! ROI is not the only means by which one makes an investment.
Very few boats would be sold if that was the case.

And ditto for buying the new software that's calling for higher
hardware specs ... there's no classical 'ROI' for a non-business
application there either to justify the change.


I can just see me stuck there with CS3. :-(

OTOH, if we think of 'ROI' in a less business/finance formalized
sense, then it is more of a "goodness of the choice" metric, and
my point was that if you don't notice any real world effect, then
it probably isn't a big problem that needs addressing ($$).


Some people buy big BMWs and ask for them to be fitted with small
engines.


The plain fact of the matter was that I had an aging
computer and Adobe was rejecting my GPU.


Understood, but did this become known because you noticed
that your workflow slowed down, or merely because you
happened to check the config/specs?


Bit of each. And the realisation had already dawned on me that it was
time I updated my computer.


Plus such requirements are easy-to-test-for Go/NoGo's which may
very well be present for some *different* technical requirement
that's not as easy to test for.

Photoshop seems to test the GPU first and then let you know if it
won't meet their requirements in one manner or another.

True ... but again, that's not proof that said difference actually
gets used by Adobe to any significant degree.


True. But Adobe is making increasing use of the GPU and I would be
surprised if this trend does not continue.


Understood, but this is also a business. Adobe's under pressure
to be [more] profitable, so if there aren't customers screaming
for more power in a way that threatens revenue, then even the cost
of merely maintaining code will come under pressure. Best bet IMO
is more of a 'status quo', although likely to have some slimming
(in net) of just how many GPU cards that they regularly certify.


I suspect the most pressure will come from video effects.


It will speed up the editing experience.

I doubt it'll speed it up that much, if at all; these
cards are made for video: the shading and special-effects
filters, such as generating real live fire and smoke effects,
and high frame rates and resolutions.

You are welcome to your doubts.

This is why I decided not to go for a car that does 0-60 in 3 seconds,
as it wouldn't really get me to work any faster than a bus
in a bus lane does.
Might be more fun than sitting on the bus though.

The change in bottom-line workflow productivity is really
the only thing that matters. If some Photoshop filter does
actually run faster due to GPU acceleration, it also needs
to be significant enough that the human behind the keyboard
is able to be more productive. Otherwise, it's a gain that
has no real-world payoff...which means its ROI isn't positive.

I'm retired, and doing it for a hobby. [...]

And thus, more the reason to keep on considering CS5/CS6
characteristics such as via Digital Lloyd's "macperformance guide"
pages, because these are still the present situation for some:
they're not doing PS as a vocation, so they've stuck with the
older versions instead of paying into CC's 'rental' business model.


I'm the reverse. I could never bring myself to pay Adobe's horrendous
up-front cost (plus upgrades) but I can tolerate the monthly rental.
It's costing me less than all the things I used to buy in the attempt
to avoid paying for CS5 or CS6.


Historically, I suspect that this all still has a ways to play out.
Adobe used to offer cheap(ish) initial buy-ins to their product
line, such as through Student pricing, which also resulted in a good
chunk of the Hobby/SMB market. Much of the change to CC's rental
model was IMO because these customers weren't buying every upgrade,
in no small part because of the poor ROI being offered by Adobe on
said 'upgrades'. That's also why Adobe embarked on imposing
additional rules to sunset upgrade qualification on older versions,
so as to force customers "up or out".

Nevertheless, if CS 5/6 suffices for what you need and you can
find a valid license to buy, it probably will pay for itself in
a year or two's worth of CC rentals.

In the end, though, this is all just business and there's not a
good business reason for CC's code to not just be recycled CS code
as much as possible. As such, these old CS observations still
apply until they've been confirmed as no longer applicable.


Plus this is still relevant as a window into Adobe's historical
coding practices, since we also know that CC wasn't a top-
to-bottom 100% code rewrite that got rid of these legacies.


I don't think they are doing a rewrite so much as adding new
features while letting the old bits drift slowly out of sight.


Exactly: because of these business interests, CC was not a 100%
"clean sheet" rewrite to make it _better_, but was the more
profitable approach of switching to this rental/lease sales model,
using the legacy CS code with bugfixes, support updates and the
occasional improvement. It's business, not technology.


-hh

--

Regards,

Eric Stevens
  #122  
Old June 22nd 17, 05:41 AM posted to rec.photo.digital
nospam

In article , -hh
wrote:


much of what photoshop does is i/o bound, which can't be parallelized.

But if it was I/O bound, then moving a computation from CPU to GPU
card & back wouldn't result in a performance gain.


many times it doesn't, and there is also the overhead of moving it back
and forth too.


If PS is as I/O bound as you claimed, then in all of those instances,
it would never pay to offload to a GPU because the delay in waiting
for the bandwidth to send/fetch would exceed the parallelism speed-up.


straw man.

photoshop uses the gpu only when it will accelerate a given operation.
if that operation would be faster on the cpu, then that's where it will
run.
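
(To put rough numbers on that trade-off: a minimal sketch of the break-even arithmetic, where the CPU, GPU and PCIe rates and the offload_pays_off helper are made-up illustrative assumptions, not Adobe's actual decision logic.)

# Sketch of a GPU-offload break-even test (hypothetical throughput figures).
def offload_pays_off(image_bytes, flops_per_byte,
                     cpu_gflops=50.0,        # assumed sustained CPU rate
                     gpu_gflops=500.0,       # assumed sustained GPU rate
                     pcie_gbytes_per_s=8.0): # assumed PCIe bandwidth
    """True if shipping the data to the GPU and back is still a net win."""
    total_flops = image_bytes * flops_per_byte
    cpu_time = total_flops / (cpu_gflops * 1e9)
    gpu_time = total_flops / (gpu_gflops * 1e9)
    transfer_time = 2 * image_bytes / (pcie_gbytes_per_s * 1e9)  # there and back
    return gpu_time + transfer_time < cpu_time

# A bandwidth-heavy op (little math per byte) stays on the CPU;
# a compute-heavy op is worth the round trip.
print(offload_pays_off(60e6, flops_per_byte=4))    # 60 MB layer, light math -> False
print(offload_pays_off(60e6, flops_per_byte=400))  # heavy math -> True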


Which as per your "much of the time" bandwidth claim, will be rare-to-never
and as such, a moot point.


nope


You're trying to deflect from admitting that you were wrong in
disagreeing
with my statement that Adobe is not a particularly sophisticated user of
GPU potential.


of course i disagree. that statement is flat out absurd.

(and you conveniently snipped that text)

As I had said:

"Contrasting that limitation on sophistication, I can personally
recall working on a project with image analysis that used
multiple discrete GPU cards (IIRC, ~8) ... way back in 2004.


*that* is irrelevant (and why it was snipped).

what you did with some random analysis app in 2004 has absolutely
nothing whatsoever to do with photoshop.


Incorrect, because both are processing images, and have a
design goal of leveraging PC hardware to do it quickly.


'processing images' is vague.

Indeed, both leveraged GPU cards for parallelism ... but Adobe only
went to the level of sophistication of a single card.


first you say adobe doesn't have the sophistication to go further, then
you say it was a business decision.

you're also ignoring the other bottlenecks that matter.

they are two totally different apps, with two totally different
code bases and two totally different goals.


Different apps? Sure.
Different code bases? Of course.

Different goals? You can't claim that, because you don't
actually know what specifically this other product did.


i don't need to know specifically what your app did, but it's very safe
to say it did not do *everything* photoshop did nor did it do it in the
ways that photoshop did it, nor was it intended to be a mass market
consumer app selling millions and millions of copies. in other words,
the goals were different.

here's a few things that come to mind that i'm *very* sure your app did not
do (or at best, very limited). did your app -
- run on both mac & windows, a requirement that takes priority over a
platform-specific hardware tweak, by using a custom cross-platform
framework written in-house?
- have its own custom virtual memory system that is tuned for image
processing which significantly outperforms what the operating system
can do and can also address a significantly larger memory space?
- increase its performance by using numerous internal image buffers for
luts, multiple undo states and various other things?
- use a platform independent gpu library (pixel bender), also written
in-house?
- support various types of third party plug-ins in a platform agnostic
manner?
- support non-destructive actions?

my guess is a resounding 'no', particularly the custom virtual memory,
which requires a level of sophistication well beyond anything your app
did.

put simply: very different goals.

FYI, what it did can be replicated in PS, but was dog slow.


so what? there are plenty of things that can be done in photoshop that
would be dog slow in your app, if they can be done at all, which would
make your app infinitely slow for those tasks.

you're also oblivious to the fact that photoshop was using multiple
processors in the 1990s, ten years before you were working on that
project. there wasn't much gpu acceleration back then, but there were
dsp cards that dramatically accelerated photoshop.


Oh, I'm quite aware of PS on Dual-CPU Macs in that era ... but
let's also keep in mind that in this period, the CPUs were still
often also "single core". Even so, did Adobe ever progress to
the point of being able to be paralleled across 6 or 8 CPUs?


of course they did, and they saw that there were bottlenecks elsewhere.
adding processors when something else is the bottleneck is a waste.

john nack (then at adobe, now at google) blogged about it, with russell
williams going into detail:

http://blogs.adobe.com/jnack/2006/12..._photoshop_multi_core.html
What may not be obvious to a non-engineer like me, however, is that
not all operations can or should be split among multiple cores, as
doing so can actually make them slower. Because memory bandwidth
hasn't kept pace with CPU speed (see Scott Byer's 64-bit article for
more info), the cost of moving data to and from each CPU can be
significant. To borrow a factory metaphor from Photoshop
co-architect Russell Williams, "The workers run out of materials &
end up standing around." The memory bottleneck means that multi-core
can't make everything faster, and we'll need to think about doing new
kinds of processing specifically geared towards heavy computing/low
memory usage.

Because Russell has forgotten more than I will ever know about this
stuff, I've asked him to share some info and insights in the extended
entry. Read on for more.
-----
Intel-based architectures don't necessarily add memory bandwidth as
they add cores. A single CPU on a system with limited memory
bandwidth can often saturate the memory bandwidth if it just moves a
big chunk of memory from here to there. It even has time to do
several arithmetic operations in between and still saturate the
memory. If your system is bandwidth-limited and the operation you
want to do involves moving a big chunk of data (bigger than the
caches) from here to there while doing a limited number of arithmetic
operations on it, adding cores cannot speed it up no matter how
clever the software is. Many Photoshop operations are in this
category, for instance.
....
The other issue is Amdahl's Law, described by computer architect Gene
Amdahl in the 1960s. Almost all algorithms that can be parallelized
also have some portion that must be done sequentially - setup
(deciding how to divide the problem up among multiple cores) or
synchronization, or collecting and summarizing the results. At those
times each step depends on the step before being completed. As you
add processors and speed up the parallel part, the sequential part
inevitably takes up a larger percentage of the time. If 10% of the
problem is sequential, then even if you add an infinite number of
processors and get the other 90% of the problem done in zero time,
you can achieve at most a 10X speedup. And some algorithms are just
really hard or impossible to parallelize: calculating text layout on
a page is a commonly cited example.
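
(The same relationship as a quick sketch in Python; the loop just reproduces the paragraph's 10%-sequential example, which tops out just under 10x however many cores are added.)

# Amdahl's Law: speedup on n processors when a fraction s of the work
# is inherently sequential.
def amdahl_speedup(s, n):
    return 1.0 / (s + (1.0 - s) / n)

for n in (2, 4, 8, 64, 1_000_000):
    print(n, round(amdahl_speedup(0.10, n), 2))
# 2 -> 1.82, 4 -> 3.08, 8 -> 4.71, 64 -> 8.77, 1,000,000 -> ~10.0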
....
Why does video rendering scale better than Photoshop? Rendering video
is typically done by taking some source image material for a frame
and performing a stack of adjustments and filters on it. Each frame
is only a few hundred thousand pixels (for standard definition) or at
most 2 megapixels or 8MB in 8-bit (for HD). Thus, particularly for
standard definition images, the cache gives a lot more benefit as a
sequence of operations are performed on each frame, and for each
frame, you fetch the data, do several operations, and write the final
result. Different frames can usually be rendered in parallel, one
per processor, and so each processor does a fair chunk of computation
for each byte read or written from memory.

By contrast, in Photoshop most time-consuming operations are
performed on a single image layer and the problem is the size of that
layer - 30MB for an 8-bit image from a 10MP digital camera. 60MB if
you keep all the information by converting the raw file to 16 bit. Or
if you've merged some Canon 1DSMkII images to HDR, that's over 200MB.
And of course the people most concerned with speeding up Photoshop
with more cores are the ones with the giant images. When you run a
Gaussian Blur on that giant image, the processor has to read all of
it from memory, perform a relatively few calculations, and then write
the result into newly allocated memory (so you can undo it). You can
work on different pieces of the image on different processors, but
you're not doing nearly as much computation on each byte fetched from
memory as in the video case.
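
(A back-of-the-envelope sketch of that contrast; the bandwidth and throughput figures and the bound_by helper are illustrative assumptions, not measurements of any particular machine.)

# Rough roofline-style check: is an operation limited by memory traffic
# or by arithmetic?  All figures are illustrative assumptions.
def bound_by(bytes_moved, flops, mem_gbytes_per_s=20.0, cpu_gflops=100.0):
    mem_time = bytes_moved / (mem_gbytes_per_s * 1e9)
    cpu_time = flops / (cpu_gflops * 1e9)
    return "memory-bound" if mem_time > cpu_time else "compute-bound"

layer = 30e6  # 30 MB: an 8-bit layer from a 10 MP camera
# A simple adjustment: read + write the layer, only ~2 ops per byte.
print(bound_by(2 * layer, 2 * layer))      # -> memory-bound
# A large-radius convolution: hundreds of ops per byte.
print(bound_by(2 * layer, 300 * layer))    # -> compute-bound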
....
To take good advantage of 8- or 16- core machines (for things other
than servers), we'll need machines whose bandwidth increases with the
number of cores, and we'll need problems that depend on doing
relatively large amounts of computation for each byte fetched from
main memory (yes, re-reading the same data you've already fetched
into the caches counts). Complex video and audio signal processing
are good examples of these kinds of tasks. And we're always looking
for more useful things that Photoshop can do that are more
computationally intensive.


russell williams discusses it again:

https://cacm.acm.org/magazines/2010/...alability-keeping-it-simple/fulltext
Photoshop's parallelism, born in the era of specialized expansion
cards, has managed to scale well for the two- and four-core machines
that have emerged over the past decade. As Photoshop's engineers
prepare for the eight- and 16-core machines that are coming, however,
they have started to encounter more and more scaling problems,
primarily a result of the effects of Amdahl's Law and
memory-bandwidth limitations.
....
...All the synchronization lived in that multiprocessor plug-in that
inserted itself into the jump table for the bottleneck routines. That
architecture was put into Photoshop in about 1994. It allowed us to
take advantage of Windows NT's symmetric multiprocessing architecture
for either two or four CPUs, which was what you could get at the time
on a very high-end machine. It also allowed us to take advantage of
the DayStar multiprocessing API on the Macintosh. You could buy
multiprocessor machines from DayStar Digital in the mid- to late-1990s
that the rest of the Mac operating system had no way of taking
advantage of - but Photoshop could.

Photoshop has well over a decade of using multiple processors to
perform the fundamental image-processing work it does on pixels. That
has scaled pretty well over the number of CPUs people have typically
been able to obtain in a system over the past decade - which is to say
either two or four processors. Essentially, no synchronization bugs
ended up surfacing in those systems over all that time.
....
For the engineers on the Photoshop development team, the scaling
limitations imposed by Amdahl's Law have become all too familiar over
the past few years. Although the application's current parallelism
scheme has scaled well over two- and four-processor systems,
experiments with systems featuring eight or more processors indicate
performance improvements that are far less encouraging. That's partly
because as the number of cores increases, the image chunks being
processed, called tiles, end up getting sliced into a greater number
of smaller pieces, resulting in increased synchronization overhead.
Another issue is that in between each of the steps that process the
image data in parallelizable chunks, there are sequential bookkeeping
steps. Because of all this, Amdahl's Law quickly transforms into
Amdahl's wall.

Photoshop's engineers tried to mitigate these effects by increasing
the tile size, which in turn made each of the sub-pieces larger. This
helped to reduce the synchronization overhead, but it presented the
developers with yet another parallel-computing bugaboo:
memory-bandwidth limitations. Compounding the problem was the fact
that Photoshop cannot interrupt pixel-processing operations until an
entire tile is complete. So to go too far down the path of increasing
tile sizes would surely result in latency issues, as the application
would become unable to respond to user input until it had finished
processing an entire tile.
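
(A minimal sketch of that tile-splitting idea in Python; it is nothing like Photoshop's actual scheduler or tile format, just an illustration of where the tile-size knob and the per-tile overhead enter.)

# Toy tile-based parallelism: split an image into row bands and process
# them in a worker pool.  Smaller tiles mean more scheduling and
# synchronization overhead; larger tiles mean longer uninterruptible chunks.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def process_tile(tile):
    # stand-in for a per-tile filter kernel
    return np.clip(tile * 1.1, 0, 255).astype(np.uint8)

def run_in_tiles(image, tile_rows, workers=4):
    tiles = [image[r:r + tile_rows] for r in range(0, image.shape[0], tile_rows)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return np.vstack(list(pool.map(process_tile, tiles)))

img = np.random.randint(0, 256, (4096, 4096), dtype=np.uint8)
out = run_in_tiles(img, tile_rows=256)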
....
As a rule, we avoid writing large amounts of processor-specific or
manufacturer-specific code, although we do some targeted performance
tuning. For us, life will start to get interesting once we can use
libraries such as OpenGL, OpenCL, and Adobe's Pixel Bender - or any
higher-level libraries that take advantage of these libraries - to get
access to all that GPU power in a more general way.

We've also been looking at the change Intel's Nehalem architecture
presents in this area. On all previous multicore CPUs, a single
thread could soak up essentially all of the memory bandwidth. Given
that many of our low-level operations are memory-bandwidth limited,
threading them would have only added overhead. Our experience with
other multicore CPUs is that they become bandwidth limited with only
a couple of threads running, so parallel speedups are limited by
memory bandwidth rather than by Amdahl's Law or the nature of the
algorithm. With Nehalem, each processor core is limited to a fraction
of the chip's total memory bandwidth, so even a large memcpy can be
sped up tremendously by being multithreaded.


Now you may wish to claim that Adobe is "sophisticated",


absolutely.

but compared to how others have done distributed graphical
processing across multiple discrete GPU cards, Adobe's
current status is over a decade behind the state of the art."


nonsense. complete utter nonsense.


You're entitled to your own opinion, but not your own facts.


that goes for you, not me.

a decade ago i was writing photoshop plug-ins and was *very* familiar
with what photoshop does under the hood.

And now you're trying to rationalize why Adobe supports but a
single GPU card with a "most people" statement: that's a reasonably
good case for business optimization - - but it simply does not support
a claim of a superior level of technological sophistication.


what matters is whether the end result is faster, not how many
gpus are used.


Copying what I've already said to try to claim it for yourself, we see.


nope. you're the one fixated on how many gpus are used, claiming it's a
matter of sophistication, then later contradicting yourself that it is
actually a business decision.

using multiple cores and/or gpus definitely helps in certain situations
(not always) and photoshop does exceptionally well at optimizing when
to offload and when not to.

using two gpus simply because two of them must be twice as good as one
is crazy. it's a bigger number so it must be better! doesn't work that
way.


But your claim was GPUs as a metric for _sophistication_.


that was *your* claim, not mine.

then you decided it was a business decision.

the same applies to multi-core. some (ignorant) people bitch about how
photoshop doesn't always use all available cores. the answer is because
sometimes it's faster to do a particular task on 1 or 2 cores than it
is on 4 or 8 cores.

photoshop is *incredibly* optimized, even tuned to specific versions of
processors.

to claim that adobe is not sophisticated is absurd.


Adobe does a pretty good job at optimization ... but within the box
that they've chosen to define for themselves.


adobe does an incredible job at optimization, among the most optimized
code available. it's downright scary the extents they go to optimize
it.

chris cox, who is responsible for most, if not all of the optimization,
is a madman. photoshop is even tuned for specific processor variants,
not just the processor family.

That has included
the choice to only support a single GPU, presumably because they
know that this higher level of sophistication is more difficult to
successfully code, and would be further into diminishing
returns in a marketplace where there are fewer customers who
need that extra bit and are willing to pay for what it costs.


it's more difficult, but that doesn't mean adobe lacks the
sophistication to do it.

it simply means that their resources are better spent optimizing
*other* parts of the app, where it has the most effect for the most
users.

Which, once again, is a business decision - - based on avoiding
the cost of technological sophistication that may not have a
large enough customer base willing to pay what it
would cost for them to develop, test, and deploy.


a business decision is not the same as not being sophisticated enough
to do it.
  #123  
Old June 22nd 17, 11:54 AM posted to rec.photo.digital
-hh

On Thursday, June 22, 2017 at 12:41:17 AM UTC-4, nospam wrote:
-hh wrote:
[...]


'processing images' is vague.


Yes, it certainly is, for nondisclosures still apply.



Indeed, both leveraged GPU cards for parallelism ... but Adobe only
went to the level of sophistication of a single card.


first you say adobe doesn't have the sophistication to go further, then
you say it was a business decision.


Yes, as Adobe's likely reason for _why_.



they are two totally different apps, with two totally different
code bases and two totally different goals.


Different apps? Sure.
Different code bases? Of course.

Different goals? You can't claim that, because you don't
actually know what specifically this other product did.


i don't need to know specifically what your app did, but it's very safe
to say it did not do *everything* photoshop did ...


Yes, but also irrelevant: you're trying to move the goalposts now to feature count.

...nor did it do it in the
ways that photoshop did it, nor was it intended to be a mass market
consumer app selling millions and millions of copies. in other words,
the goals were different.


Yes (but of course), the goals were different. If the objectives could have
been satisfied with PS, then they simply would have bought a copy of PS
and saved time & money.



FYI, what it did can be replicated in PS, but was dog slow.


so what?


Because you whined that 'processing images' was vague and was
therefore non-comparable.


you're also oblivious to the fact that photoshop was using multiple
processors in the 1990s, ten years before you were working on that
project. there wasn't much gpu acceleration back then, but there were
dsp cards that dramatically accelerated photoshop.


Oh, I'm quite aware of PS on Dual-CPU Macs in that era ... but
let's also keep in mind that in this period, the CPUs were still
often also "single core". Even so, did Adobe ever progress to
the point of being able to be paralleled across 6 or 8 CPUs?


of course they did,


Got proof? Cite please, and do make sure that it is from the 1990's.

and they saw that there were bottlenecks elsewhere.
adding processors when something else is the bottleneck is a waste.


What generally happens for both CPU- and GPU-based parallelism is that use
of the technique invokes the law of diminishing returns: the necessary
increase in management overhead offsets part of the performance gain from each
additional unit of parallelism. Invariably, that overhead grows larger than the
incremental gain, which makes the net performance curve asymptotic.
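
(A toy model of that curve; the serial fraction and per-worker overhead below are arbitrary assumptions, chosen only to show the net gain flattening out and then declining.)

# Ideal parallel speedup minus a per-worker coordination cost: the net
# gain shrinks with each added worker and eventually declines.
def net_speedup(n, serial_fraction=0.10, overhead_per_worker=0.01):
    parallel_time = serial_fraction + (1.0 - serial_fraction) / n
    return 1.0 / (parallel_time + overhead_per_worker * n)

for n in (1, 2, 4, 8, 16, 32):
    print(n, round(net_speedup(n), 2))
# 1 -> 0.99, 2 -> 1.75, 4 -> 2.74, 8 -> 3.42, 16 -> 3.16, 32 -> 2.23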


john nack (then at adobe, now at google) blogged about it, with russell
williams going into detail:

http://blogs.adobe.com/jnack/2006/12..._photoshop_multi_core.html
...


And regardless of the pedantic minutiae, it is functionally still
nothing more than the law of diminishing returns being applied yet again.


a business decision is not the same as not being sophisticated enough
to do it.


When they won't pay for it (because it's not a good business case), the
potential for higher sophistication goes unrealized, which also means
that there's no tangible proof in the form of a final product that you or I
can see as proof of their sophistication.

TL;DR summary: you're trying to say, "see, these Adobe guys are really smart
and they've looked into this question and decided not to do X"... but another
bunch of guys (at a different corporation) went and did it anyway.


-hh
  #124  
Old June 22nd 17, 05:24 PM posted to rec.photo.digital
David B.[_3_]

On 22-Jun-17 10:49 AM, Whisky-dave wrote:
On Wednesday, 21 June 2017 19:01:24 UTC+1, David B. wrote:
On 21-Jun-17 3:43 PM, PAS wrote:
On 6/21/2017 10:22 AM, David B. wrote:
On 21-Jun-17 2:45 PM, PAS wrote:
I plan on getting an iMac within the next year to see how I like it.

I have absolutely no doubt that you will LOVE it! :-)

FWIW, I recommend that you splash out and go for the top of the range
27 inch iMac.

Who amongst us would not love a shiny new toy? I've always admired the
current design of the iMac. There is an Apple Store not far from where
I live and on the few occasions I find myself at the mall where it is, I
always stop in to check them out.


My son was, at that time, an RAF helicopter pilot and instructor, serving
with the USAF at Kirtland Air Force Base in Albuquerque, New Mexico,
when the iMac was first launched in 2008.

A good article here: https://en.wikipedia.org/wiki/IMac

He was enthralled after checking it out in the Apple store in
Albuquerque and bought one with hardly a second thought. He was
swayed, I'm sure, by the fact that Jonathan Ive, the designer, is a
fellow Brit! ;-)

https://en.wikipedia.org/wiki/Jonathan_Ive

Within days he was on the 'phone insistent that his dad should buy one
too - so I did! One of the best decisions I have _ever_ made. :-)

Sadly, I have to leave it at home when I'm on board my narrowboat


What difference does a narrowboat make?


About 160 miles of motorway travel! ;-)

My wife would go spare if I wanted to bring my iMac with us in the car
each time we embarked; there's hardly room to skin a cat as it is! ;-)

I've a friend that was on a canal boat and he had a Mac on it, and someone he knows has a Mac on their boat. Those PC users haven't been telling you you can't use a Mac on a narrowboat, have they? I know what some of them are like on here :-) Eric, you've not been spreading rumours, have you?


You are absolutely right when you say there is no real reason one cannot
have a Mac on a boat - other than somewhat limited space, which *I*
share with my other half.

The temperature here this afternoon was/is 33 degrees Celsius -
outrageously hot for England!


Too bloody right, I blame brexit.


Haha! :-) Were you a remainer?!!!

--
Sometimes man stumbles over the truth. (W.Churchill)
  #125  
Old June 22nd 17, 10:14 PM posted to rec.photo.digital
Eric Stevens

On Thu, 22 Jun 2017 02:49:42 -0700 (PDT), Whisky-dave
wrote:

On Wednesday, 21 June 2017 19:01:24 UTC+1, David B. wrote:
On 21-Jun-17 3:43 PM, PAS wrote:
On 6/21/2017 10:22 AM, David B. wrote:
On 21-Jun-17 2:45 PM, PAS wrote:
I plan on getting an iMac within the next year to see how I like it.

I have absolutely no doubt that you will LOVE it! :-)

FWIW, I recommend that you splash out and go for the top of the range
27 inch iMac.

Who amongst us would not love a shiny new toy? I've always admired the
current design of the iMac. There is an Apple Store not far from where
I live and on the few occasions I find myself at the mall where it is, I
always stop in to check them out.


My son was, at that time, an RAF helicopter pilot and instructor, serving
with the USAF at Kirtland Air Force Base in Albuquerque, New Mexico,
when the iMac was first launched in 2008.

A good article here: https://en.wikipedia.org/wiki/IMac

He was enthralled after checking it out in the Apple store in
Albuquerque and bought one with hardly a second thought. He was
swayed, I'm sure, by the fact that Jonathan Ive, the designer, is a
fellow Brit! ;-)

https://en.wikipedia.org/wiki/Jonathan_Ive

Within days he was on the 'phone insistent that his dad should buy one
too - so I did! One of the best decisions I have _ever_ made. :-)

Sadly, I have to leave it at home when I'm on board my narrowboat


What difference does a narrowboat make?


I've a friend that was on a canal boat and he had a Mac on it, and someone he knows has a Mac on their boat. Those PC users haven't been telling you you can't use a Mac on a narrowboat, have they? I know what some of them are like on here :-) Eric, you've not been spreading rumours, have you?

It's got me puzzled, unless the narrowboat never moves and never gets
to run the engine to charge batteries. 'Get a gen set' somebody cries,
but the neighbours will object to the noise. Even windmills make
unpleasant neighbours. Solar panels might be the answer.


- like
now. The temperature here this afternoon was/is 33 degrees Celsius -
outrageously hot for England!


Too bloody right, I blame brexit.

--

Regards,

Eric Stevens
  #126  
Old June 22nd 17, 10:22 PM posted to rec.photo.digital
Eric Stevens

On Thu, 22 Jun 2017 03:18:00 -0700 (PDT), Whisky-dave
wrote:

On Thursday, 22 June 2017 00:00:54 UTC+1, Eric Stevens wrote:
On Wed, 21 Jun 2017 04:56:19 -0700 (PDT), Whisky-dave
wrote:

On Wednesday, 21 June 2017 05:04:13 UTC+1, Eric Stevens wrote:



As I have previously explained, I have no ROI so that wasn't one of my
considerations. The plain fact of the matter was that I had an aging
computer and Adobe was rejecting my GPU.

When you bought the computer you didn't future-proof it by buying a graphics card that adobe would support in the future; I thought that was your aim of future-proofing.

It lasted me 6 years before I ran into that. Who is to complain?


But it was adobe that made it incompatible, not the actual hardware.
What makes you think adobe won't reject your new GPU in a year or two?

As you will know by now, my GPU is near the head of the game. If Adobe
advances ahead of where it will go, they will render the vast
majority of other GPUs unsuitable for their purposes. I don't think
they will do that.
--

Regards,

Eric Stevens
  #127  
Old June 23rd 17, 04:13 AM posted to rec.photo.digital
android

In article ,
"David B." wrote:

You are absolutely right when you say there is no real reason one cannot
have a Mac on a boat - other than somewhat limited space, which *I*
share with my other half.


The iMacs have VESA mounts, IIRC. Put it on the wall and be done with
it...
--
teleportation kills
  #128  
Old June 27th 17, 01:24 AM posted to rec.photo.digital
Eric Stevens

On Mon, 26 Jun 2017 02:18:21 -0700 (PDT), Whisky-dave
wrote:

On Thursday, 22 June 2017 22:22:05 UTC+1, Eric Stevens wrote:
On Thu, 22 Jun 2017 03:18:00 -0700 (PDT), Whisky-dave
wrote:

On Thursday, 22 June 2017 00:00:54 UTC+1, Eric Stevens wrote:
On Wed, 21 Jun 2017 04:56:19 -0700 (PDT), Whisky-dave
wrote:

On Wednesday, 21 June 2017 05:04:13 UTC+1, Eric Stevens wrote:



As I have previously explained, I have no ROI so that wasn't one of my
considerations. The plain fact of the matter was that I had an aging
computer and Adobe was rejecting my GPU.

When you bought the computer you didn't future-proof it by buying a graphics card that adobe would support in the future; I thought that was your aim of future-proofing.

It lasted me 6 years before I ran into that. Who is to complain?

But it was adobe that made it incompatible, not the actual hardware.
What makes you think adobe won't reject your new GPU in a year or two?

As you will know by now, my GPU is near the head of the game.


Yes, but are you so sure that a new card will give you the performance boost you want/expect? That was the point.


It has certainly given it back to me and the new machine is quicker
than the old at many minor tasks. The most time consuming task I have
is using LR to make a video from a batch of raw (NEF) files running as
a slide show. I had hoped it would be faster but it is not
conspicuously so. The performance window shows that it uses about 40%
of the CPU on average to do this. I tried making two
simultaneously and got no higher than about 60% of the CPU capacity on
average. However most of the time it carried on mumbling to itself
seeming not to be doing much of anything anywhere. The creation of and
conversion to a video does not appear to be a continuous process but a
series of batch operations which do not take full advantage of the
multiple CPU cores.

I would be concerned if from the outset LR or PS could load my new
machine to capacity. That would bode ill for load increases of the
future.

A few weeks ago we bought the 1060 card; it's for a student VR project. We have an HD camera and headset, and the card is needed for real-time shading.

If Adobe
advances ahead of where it will go they will be render the vast
majority of other GPUs unsuitable for their purposes. I don't think
they will do that.


Probably not. I doubt adobe will write software that needs a particular graphics card, especially a relatively high-end one.


Even the lowest graphics cards of today would have been regarded as
high-end ones ten years ago.

In the early days of photoshop 2.5 to 3.0, with a Mac LC III, I was getting posterisation times for a 720 DPI postcard of 30 seconds or so. The processor ran at 25MHz; by changing a couple of surface-mount resistors on the logic board you could get it to run at 33MHz, which I did, and it cut the posterisation time for that file down to the expected 20 seconds.
They didn't use GPUs then, but adobe did use an FPU, which the LC III didn't have; that would have really cut the speed down by a factor of 1000X or so.


If I were building a PC to run photoshop I'd get a reasonable graphics card, not spending much more than 150 UKP, then in 1-3 years look to upgrading (depending what adobe actually do), rather than spending 300+ and hoping adobe will write software that can take advantage of all the knobs and whistles that a high-end graphics card has. I just don't see adobe putting that much effort into such things with relatively little payback.


But I am looking at upgrading not in 1-3 years but in 6+ years,
and that makes a difference. Especially when you consider that my
screens may not last that long.
--

Regards,

Eric Stevens
  #129  
Old June 27th 17, 02:15 AM posted to rec.photo.digital
nospam

In article , Eric Stevens
wrote:

If Adobe
advances ahead of where it will go they will be render the vast
majority of other GPUs unsuitable for their purposes. I don't think
they will do that.


Probably not. I doubt adobe will write software that needs a particular
graphics card, especially a relatively high-end one.


Even the lowest graphics cards of today would have been regarded as
high-end ones ten years ago.


same for a computer. so what?

a modern phone or tablet is more capable than a desktop computer of a
decade ago.
  #130  
Old June 27th 17, 09:24 AM posted to rec.photo.digital
David B.[_3_]

On 26-Jun-17 10:25 AM, Whisky-dave wrote:
On Thursday, 22 June 2017 17:24:12 UTC+1, David B. wrote:
On 22-Jun-17 10:49 AM, Whisky-dave wrote:
On Wednesday, 21 June 2017 19:01:24 UTC+1, David B. wrote:
On 21-Jun-17 3:43 PM, PAS wrote:
On 6/21/2017 10:22 AM, David B. wrote:
On 21-Jun-17 2:45 PM, PAS wrote:
I plan on getting an iMac within the next year to see how I like it.

I have absolutely no doubt that you will LOVE it! :-)

FWIW, I recommend that you splash out and go for the top of the range
27 inch iMac.

Who amongst us would not love a shiny new toy? I've always admired the
current design of the iMac. There is an Apple Store not far from where
I live and on the few occasions I find myself at the mall where it is, I
always stop in to check them out.

My son was, at that time, an RAF helicopter pilot and instructor, serving
with the USAF at Kirtland Air Force Base in Albuquerque, New Mexico,
when the iMac was first launched in 2008.

A good article here: https://en.wikipedia.org/wiki/IMac

He was enthralled after checking it out in the Apple store in
Albuquerque and bought one with hardly a second thought. He was
swayed, I'm sure, by the fact that Jonathan Ive, the designer, is a
fellow Brit! ;-)

https://en.wikipedia.org/wiki/Jonathan_Ive

Within days he was on the 'phone insistent that his dad should buy one
too - so I did! One of the best decisions I have _ever_ made. :-)

Sadly, I have to leave it at home when I'm on board my narrowboat

What difference does a narrowboat make?


About 160 miles of motorway travel! ;-)

My wife would go spare if I wanted to bring my iMac with us in the car
each time we embarked; there's hardly room to skin a cat as it is! ;-)


Well there's an obvious answer to that problem ........


Haha! :-) I'm not going to buy a bigger car (or leave my other half at
home! ;-) )

You are absolutely right when you say there is no real reason one cannot
have a Mac on a boat - other than somewhat limited space, which *I*
share with my other half.


It sounds like a certain half is taking up rather too much valuable space


Perhaps you're right - but I'd miss her after being married for over 50
years! ;-)

--
Sometimes man stumbles over the truth. (W.Churchill)
 



