Travel without a camera



 
 
  #141  
Old June 28th 17, 01:40 AM posted to rec.photo.digital
Eric Stevens
external usenet poster
 
Posts: 13,611
Default Travel without a camera

On Mon, 26 Jun 2017 21:15:13 -0400, nospam
wrote:

In article , Eric Stevens
wrote:

If Adobe advances ahead of where it will go, they will render the vast
majority of other GPUs unsuitable for their purposes. I don't think
they will do that.

Probably not. I doubt Adobe will write software that needs a particular
graphics card, especially a relatively high-end one.


Even the lowest graphics card of today would have been regarded as a
high-end one ten years ago.


same for a computer. so what?

a modern phone or tablet is more capable than a desktop computer of a
decade ago.

--

Regards,

Eric Stevens
  #142  
Old June 28th 17, 03:02 AM posted to rec.photo.digital
Eric Stevens
external usenet poster
 
Posts: 13,611
Default Travel without a camera

On Tue, 27 Jun 2017 02:28:09 -0700 (PDT), Whisky-dave
wrote:

On Tuesday, 27 June 2017 01:24:21 UTC+1, Eric Stevens wrote:
On Mon, 26 Jun 2017 02:18:21 -0700 (PDT), Whisky-dave
wrote:

On Thursday, 22 June 2017 22:22:05 UTC+1, Eric Stevens wrote:
On Thu, 22 Jun 2017 03:18:00 -0700 (PDT), Whisky-dave
wrote:

On Thursday, 22 June 2017 00:00:54 UTC+1, Eric Stevens wrote:
On Wed, 21 Jun 2017 04:56:19 -0700 (PDT), Whisky-dave
wrote:

On Wednesday, 21 June 2017 05:04:13 UTC+1, Eric Stevens wrote:



As I have previously explained, I have no ROI so that wasn't one of my
considerations. The plain fact of the matter was that I had an aging
computer and Adobe was rejecting my GPU.

When you bought the computer you didn't future-proof it by buying a graphics card that Adobe would support in the future; I thought that was your aim in future-proofing.

It lasted me 6 years before I ran into that. Who is to complain?

But it was Adobe that made it incompatible, not the actual hardware.
What makes you think Adobe won't reject your new GPU in a year or two?

As you will know by now, my GPU is near the head of the game.

Yes, but are you so sure that a new card will give you the performance boost you want/expect? That was the point.


It has certainly given it back to me and the new machine is quicker
than the old at many minor tasks. The most time consuming task I have
is using LR to make a video from a batch of raw (NEF) files running as
a slide show. I had hoped it would be faster but it is not
conspicuously so.


I think video makes use of hyperthreading on a Mac with an i7, which is why I wanted an i7, and I don't think the graphics card has any real-world effect on speeding up such a process.


But the 1070 also gives me 10 bit graphics and large screens.

The performance window shows that it uses about 40%
of the CPU on average to do this. I tried it with making two
simultaneously and got no higher than about 60% of the CPU capacity on
average.


So that might indicate that even if you spend a million on a graphics card with a billion GPUs it won't be any faster than a standard graphics card.
Buying a faster main CPU would speed things up a little.
I think this is where the peak-performance or turbo-boost advantage helps.

However most of the time it carried on mumbling to itself
seeming not to be doing much of anything anywhere.


I have students like that

The creation of and
conversion to a video does not appear to be a continuous process but a
series of batch operations which do not take full advantage of the
multiple CPU cores.
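
To make that point concrete, here is a minimal sketch in Python (hypothetical file names, not Lightroom's actual pipeline) of the difference between developing frames one at a time and farming the per-frame work out to a pool of workers; a run of mostly serial stages is the kind of thing that keeps overall CPU usage stuck well below 100%.

import multiprocessing as mp
from pathlib import Path

def develop(nef_path):
    # stand-in for the per-frame work (demosaic, tone-map, resize);
    # a real raw-converter call would go here
    return f"rendered {nef_path.name}"

def render_serial(files):
    # one frame at a time: only one core is ever busy
    return [develop(f) for f in files]

def render_parallel(files):
    # one worker per core: the per-frame stage scales with core count,
    # but any final single-threaded encode stage still caps the total speedup
    with mp.Pool() as pool:
        return pool.map(develop, files)

if __name__ == "__main__":
    frames = sorted(Path("slideshow").glob("*.NEF"))  # hypothetical folder of raw files
    clips = render_parallel(frames)
    print(f"rendered {len(clips)} frames")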


Even on the Mac with dual CPUs it took a while for software companies to make use of them, so buying them didn't give you an immediate speed increase, and by the time they did make use of them CPU speeds had increased at a faster rate.

What you'd need to find out is the actual processes LR uses; I've no idea. A friend manually tried to make a slide-show movie: she used high-res 360 DPI stills and put 25 of them together to make each 1-second clip, and the movie lasted less than 10 seconds. She couldn't understand why, but the file size for those 10 seconds had exceeded the 4GB limit for a single file.
You can get a whole feature film in HD, if not 4K, in 4GB.

I would be concerned if from the outset LR or PS could load my new
machine to capacity. That would bode ill for load increases of the
future.


Well, sometimes it's fun to see how much load you can put on a machine, but you have to be realistic.
When I play a game I can get 400 FPS at 5K or so, but the fans really kick in, it's noticeably noisier because of this, and the iMac heats up more than usual. But I don't really need 400 FPS; 50 is fine for playing such a game, and even 30 is OK.

A few weeks ago we bought the 1060 card; it's for a student VR project. We have an HD camera and headset, and the card is needed for real-time shading.

If Adobe advances ahead of where it will go, they will render the vast
majority of other GPUs unsuitable for their purposes. I don't think
they will do that.

Probably not. I doubt Adobe will write software that needs a particular graphics card, especially a relatively high-end one.


Even the lowest graphics card of today would have been regarded as a
high-end one ten years ago.


But that isn't the point, is it, as the lower-end graphics cards of today wouldn't have worked in PCs of 10 years ago.


Probably wouldn't have been recognised by the bus.


In the early days of Photoshop, 2.5 to 3.0, on a Mac LC III, I was getting posterisation times of 30 seconds or so for a 720 DPI postcard. The processor was 25MHz; by changing a couple of surface-mount resistors on the logic board you could get it to run at 33MHz, which I did, and that cut the posterisation time for that file down to the expected 20 seconds.
They didn't use GPUs then, but Adobe did use an FPU, which the LC III didn't have; that would have really cut the time down, by a factor of 1000x or so.


If I were building a PC to run Photoshop I'd get a reasonable graphics card, not spending much more than 150 UKP, then in 1-3 years look at upgrading (depending on what Adobe actually do), rather than spending 300+ and hoping Adobe will write software that can take advantage of all the bells and whistles that a high-end graphics card has. I just don't see Adobe putting that much effort into such things for relatively little payback.


But I am looking forward to upgrading not in 1-3 years but in 6+ years,


Why wait so long?


I dunno: it's just that, except for one unfortunate machine, I've
always done it that way.

and that makes a difference. Especially when you consider that my
screens may not last that long.


So why wait 6 years?


Why not?

Just think if you'd bought a high-end computer 20 years ago.
Would it have been worth getting a high-spec machine in 1997 and then waiting 20 years to upgrade, or would it be better to upgrade every couple of years with a less state-of-the-art machine? That's the way I'd go, Mac or PC, TV or washing machine.

The higher end options come down in price quicker and to a larger degree too.


Because they cost more to begin with. But a low end machine comes down
to rock-bottom value extremely quickly.
--

Regards,

Eric Stevens
  #143  
Old June 28th 17, 08:26 AM posted to rec.photo.digital
David Taylor
external usenet poster
 
Posts: 1,146
Default Travel without a camera

On 27/06/2017 17:19, David B. wrote:
[] Thank you. :-)

I've read this:
https://manage.realvnc.com/pricing?_...848.1498580023

Perhaps I'll try that once I'm back home again, especially as there is
no charge for Home Users!


Be aware that VNC may not work except on the same LAN segment, unless
you reconfigure your firewall.

--
Cheers,
David
Web: http://www.satsignal.eu
  #144  
Old June 28th 17, 08:44 AM posted to rec.photo.digital
David B.[_3_]
external usenet poster
 
Posts: 96
Default Travel without a camera

On 28-Jun-17 8:26 AM, David Taylor wrote:
On 27/06/2017 17:19, David B. wrote:
[] Thank you. :-)

I've read this:
https://manage.realvnc.com/pricing?_...848.1498580023

Perhaps I'll try that once I'm back home again, especially as there is
no charge for Home Users!


Be aware that VNC may not work except on the same LAN segment, unless
you reconfigure your firewall.


Thank you, David. :-) I'll do my best to remember that!

When I have a bit more time, I'll have a proper look at your
comprehensive web site. I was saddened to read that you have been ill,
but trust you are now much better.

--
Regards,
David B.
  #145  
Old June 29th 17, 12:58 AM posted to rec.photo.digital
Eric Stevens
external usenet poster
 
Posts: 13,611
Default Travel without a camera

On Wed, 28 Jun 2017 02:42:01 -0700 (PDT), Whisky-dave
wrote:

On Wednesday, 28 June 2017 03:02:38 UTC+1, Eric Stevens wrote:
On Tue, 27 Jun 2017 02:28:09 -0700 (PDT), Whisky-dave
wrote:

On Tuesday, 27 June 2017 01:24:21 UTC+1, Eric Stevens wrote:
On Mon, 26 Jun 2017 02:18:21 -0700 (PDT), Whisky-dave
wrote:

On Thursday, 22 June 2017 22:22:05 UTC+1, Eric Stevens wrote:
On Thu, 22 Jun 2017 03:18:00 -0700 (PDT), Whisky-dave
wrote:

On Thursday, 22 June 2017 00:00:54 UTC+1, Eric Stevens wrote:
On Wed, 21 Jun 2017 04:56:19 -0700 (PDT), Whisky-dave
wrote:

On Wednesday, 21 June 2017 05:04:13 UTC+1, Eric Stevens wrote:



As I have previously explained, I have no ROI so that wasn't one of my
considerations. The plain fact of the matter was that I had an aging
computer and Adobe was rejecting my GPU.

When you bought the computer you didn't future-proof it by buying a graphics card that Adobe would support in the future; I thought that was your aim in future-proofing.

It lasted me 6 years before I ran into that. Who is to complain?

But it was Adobe that made it incompatible, not the actual hardware.
What makes you think Adobe won't reject your new GPU in a year or two?

As you will know by now, my GPU is near the head of the game.

Yes, but are you so sure that a new card will give you the performance boost you want/expect? That was the point.

It has certainly given it back to me and the new machine is quicker
than the old at many minor tasks. The most time consuming task I have
is using LR to make a video from a batch of raw (NEF) files running as
a slide show. I had hoped it would be faster but it is not
conspicuously so.

I think video makes use of hyperthreading on a Mac with an i7, which is why I wanted an i7, and I don't think the graphics card has any real-world effect on speeding up such a process.


But the 1070 also gives me 10 bit graphics and large screens.


So have a lot of other cards used by gamers for the last 15+ years.


Very few, in fact, and at elevated prices. I know, I have been
looking.

As for large screens, how large? Are you planning on running a cinema or something?


Yes, but the cinema is very small. Just 8 bits per seat.







But I am looking forward to upgrading not in 1-3 years but in 6+ years,

Why wait so long?


I dunno: it's just that, except for one unfortunate machine, I've
always done it that way.


Well, computer processing power roughly doubles every 18 months.
In the 90s I bought a new computer every 18 months to 2 years because of what the next model up could do that the previous couldn't; that is less noticeable now, I've found, so I don't upgrade as often.
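
On that rule of thumb the sums are simple: a 6-year gap is 72/18 = 4 doublings, so roughly 2^4 = 16x the raw processing power of the machine being replaced, while a 3-year cycle means the machine you are using is never more than about 2^2 = 4x off the current state of the art.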

I have similar views on upgrading most things, including cameras: I don't have a fixed term that says in X years I will update; I wait until the step up is reasonable and worthwhile.
Computer-wise, a friend only keeps his graphics card for about 2 years, then sells it on eBay while it's still got some value and puts the money towards a new graphics card. That way he keeps reasonably up to date without having to spend a fortune; he's normally a year behind the current best, but 5 years ahead of having old equipment that no one else wants, which just goes in the skip.

So what will you do with the old PC & graphics card?


My wife uses it.


and that makes a difference. Especially when you consider that my
screens may not last that long.

So why wait 6 years?


Why not?


Because it would most likely be more efficient to upgrade after, say, 3 years, being able to sell your old graphics card and buy a new one.

People have been doing this with cars for years (though mostly for different reasons). Those with a lot of disposable money buy a new car straight out of the showroom, keep it 1-3 years, then sell it on or part-exchange it and get another new car. Their old car is bought secondhand by someone who isn't so into the latest model and is happy to save 40% on a 3-year-old car;
he will be selling his 6-9-year-old car, or maybe pass it down the family.
The person who gets that will most likely scrap it in a couple of years, or drive it into a mound of soil and stones as a curiosity or art project.



Just think if you'd bought a high-end computer 20 years ago.
Would it have been worth getting a high-spec machine in 1997 and then waiting 20 years to upgrade, or would it be better to upgrade every couple of years with a less state-of-the-art machine? That's the way I'd go, Mac or PC, TV or washing machine.

The higher end options come down in price quicker and to a larger degree too.


Because they cost more to begin with. But a low end machine comes down
to rock-bottom value extremely quickly.


Yes, exactly, so why hang on to it until dog **** is worth more?

But I don't.
--

Regards,

Eric Stevens
  #146  
Old June 29th 17, 04:50 AM posted to rec.photo.digital
nospam
external usenet poster
 
Posts: 24,165
Default Travel without a camera

In article , David Taylor
wrote:

Be aware that VNC may not work except on the same LAN segment, unless
you reconfigure your firewall.


which takes maybe 10 seconds. maybe a minute if you have to look up the
port number.
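
For anyone trying that, a minimal sketch in Python (assuming the default VNC port of TCP 5900 and a made-up address) to check from a second machine whether the port is reachable at all before blaming the viewer:

import socket

HOST = "192.168.1.50"  # hypothetical address of the machine running the VNC server
PORT = 5900            # default VNC port (display :0); adjust if the server uses another

try:
    # a plain TCP connect; success means the firewall/router is letting it through
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"{HOST}:{PORT} is reachable - the firewall is not the problem")
except OSError as exc:
    print(f"{HOST}:{PORT} is not reachable ({exc}) - check the firewall or port forwarding")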
  #147  
Old June 30th 17, 05:14 AM posted to rec.photo.digital
Eric Stevens
external usenet poster
 
Posts: 13,611
Default Travel without a camera

On Thu, 29 Jun 2017 02:37:31 -0700 (PDT), Whisky-dave
wrote:

On Thursday, 29 June 2017 00:58:17 UTC+1, Eric Stevens wrote:
On Wed, 28 Jun 2017 02:42:01 -0700 (PDT), Whisky-dave
wrote:


I think video makes use of hyperthreading on a Mac with an i7, which is why I wanted an i7, and I don't think the graphics card has any real-world effect on speeding up such a process.

But the 1070 also gives me 10 bit graphics and large screens.

So have a lot of other cards used by gamers for the last 15+ years.


Very few, in fact, and at elevated prices. I know, I have been
looking.


Maybe you've been looking in the wrong places.

https://www.neowin.net/forum/topic/1...for-photoshop/
Posted February 18, 2016

Actually, there are quite a few GPUs with 10-bit output - both AMD and nVidia have been shipping them, for average-Joe use, for years.

Every "gamer" GPU since at least 2000 - if not earlier - supports OpenGL AND DirectX because those same GPUs are also used in workstations. nVidia Quadro and AMD FirePro use the same GPUs their gaming cards do; to a large extent, their drivers can cross over as well (in most Linux distributions, in fact, they completely cross over). The high-end of Quadro today uses the same Maxwell 2.0 GPUs as the GTX9xx series; where they differ is the amount of GDDR5.

Will the PC in question EVER be used for gaming? If so, you should seriously consider the GTX970 - not only does it have the horsepower it needs to run DirectX gaming in 4K, it's almost overkill for Photoshop - or AutoCAD/SolidWorks, for that matter

--------------------------------

The GTX970 is about 1/5th the price of your card, and that was in 2016.


The GTX 900 series has been succeeded by the GTX 1000 series. I have
bought the successor to the 970; i.e. the 1070.

See also https://www.dpreview.com/forums/thread/4084658

"NVIDIA Geforce graphics cards have offered 10-bit per color out to
a full screen Direct X surface since the Geforce 200 series GPUs.
Due to the way most applications use traditional Windows API
functions to create the application UI and viewport display, this
method is not used for professional applications such as Adobe
Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit
per color buffers which require an NVIDIA Quadro GPU with
DisplayPort connector. A small number of monitors support 10-bit
per color with Quadro graphics cards over DVI. For more information
on NVIDIA professional line of Quadro GPUs, please visit:"

The GTX 1070 seems to be a very recent exception.
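
For scale, 8 bits per channel gives 2^8 = 256 gradations per colour (about 16.8 million colours in total), while 10 bits per channel gives 2^10 = 1024 gradations (about 1.07 billion); in practice the gain shows up as smoother gradients on a wide-gamut display rather than as extra detail.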


As for large screens, how large? Are you planning on running a cinema or something?


Yes, but the cinema is very small. Just 8 bits per seat.


Is that at a resolution of one bum[1] per seat? ;-)


Yep. And the screen has a diagonal of 25".


[1] Where one British bum is roughly equivalent to one American fanny/backside/arse.

--

Regards,

Eric Stevens
  #148  
Old July 2nd 17, 02:49 AM posted to rec.photo.digital
nospam
external usenet poster
 
Posts: 24,165
Default Travel without a camera

In article , -hh
wrote:


'processing images' is vague.


Yes, it certainly is, for nondisclosures still apply.


then you can't cite it.

for all we know, it doesn't even exist.

Indeed, both leveraged GPU cards for parallelism ... but Adobe only
went to the level of sophistication of a single card.


first you say adobe doesn't have the sophistication to go further, then
you say it was a business decision.


Yes, as Adobe's likely reason for _why_.


and a very valid reason.

that doesn't mean adobe lacks sophistication. it simply means they have
better things to do than worry about a tiny inconsequential number of
users, no matter how whiney they may be.

they are two totally different apps, with two totally different
code bases and two totally different goals.

Different apps? Sure.
Different code bases? Of course.

Different goals? You can't claim that, because you don't
actually know what specifically this other product did.


i don't need to know specifically what your app did, but it's very safe
to say it did not do *everything* photoshop did ...


Yes, but also irrelevant: you're trying to move the goalposts now to feature
count.


it's exactly relevant and not about feature count.

two very different goals.

in fact, photoshop was so complex that apple's own xcode couldn't even
compile it (while codewarrior could). apple had to modify xcode so that
adobe could start to use it.

...nor did it do it in the
ways that photoshop did it, nor was it intended to be a mass market
consumer app selling millions and millions of copies. in other words,
the goals were different.


Yes (but of course), the goals were different.


that's what i said.

If the objectives could have
been satisfied with PS, then they simply would have bought a copy of PS
and saved time & money.


except you won't say what those objectives are...

FYI, what it did can be replicated in PS, but was dog slow.


so what?


Because you whined that 'processing images' was vague and was
therefore non-comparable.


not as vague as an app you can't talk about.

you're also oblivious to the fact that photoshop was using multiple
processors in the 1990s, ten years before you were working on that
project. there wasn't much gpu acceleration back then, but there were
dsp cards that dramatically accelerated photoshop.

Oh, I'm quite aware of PS on Dual-CPU Macs in that era ... but
let's also keep in mind that in this period, the CPUs were still
often also "single core". Even so, did Adobe ever progress to
the point of being able to be paralleled across 6 or 8 CPUs?


of course they did,


Got proof? Cite please, and do make sure that it is from the 1990's.


read the links i provided.

and they saw that there were bottlenecks elsewhere.
adding processors when something else is the bottleneck is a waste.


What generally happens for both CPU- and GPU-based parallelism is that the
technique invokes the law of diminishing returns: the necessary increase in
management overhead offsets the performance gain from the (N+1)th unit of
parallelism. Invariably, that overhead becomes larger than the incremental
gain, which makes the net performance curve asymptotic.
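
The asymptotic curve described above is, at its simplest, Amdahl's law; a minimal sketch of that arithmetic, assuming purely for illustration that 80% of the work parallelises cleanly:

# Amdahl's law: if a fraction p of the work can be spread over n workers,
# speedup = 1 / ((1 - p) + p / n), which approaches 1 / (1 - p) as n grows.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.8  # assumed parallel fraction, for illustration only
for n in (1, 2, 4, 8, 16, 1_000_000):
    print(f"{n:>8} workers: {speedup(p, n):.2f}x")
# the curve tops out near 5x however many workers are added, and any real
# per-worker management overhead only pulls it down further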


it's a lot more than that.

john nack (then at adobe, now at google) blogged about it, with russell
williams going into detail:

http://blogs.adobe.com/jnack/2006/12..._photoshop_multi_core.html
...


And regardless of the pedantic minutiae, it is still functionally nothing more
than the law of diminishing returns being applied yet again.


it's a lot more than that.

a business decision is not the same as not being sophisticated enough
to do it.


When they won't pay for it (because it's not a good business case), the
potential for higher sophistication goes unrealized, which also means
that there's no tangible proof, in the form of a final product, that you or I
can see of their sophistication.

TL;DR summary: you're trying to say, "see, these Adobe guys are really smart
and they've looked into this question and decided not to do X"... but another
bunch of guys (at a different corporation) went and did it anyway.


two different companies with different goals and different constraints.
  #149  
Old July 2nd 17, 04:12 PM posted to rec.photo.digital
PeterN[_6_]
external usenet poster
 
Posts: 4,254
Default Travel without a camera

On 7/1/2017 9:49 PM, nospam wrote:
--
PeterN
 



