#111
Travel without a camera
On Wednesday, June 21, 2017 at 11:27:19 AM UTC-4, nospam wrote:
> -hh wrote:
> > nospam wrote:
> > > much of what photoshop does is i/o bound, which can't be
> > > parallelized.
> > But if it was I/O bound, then moving a computation from CPU to GPU
> > card & back wouldn't result in a performance gain.
> many times it doesn't, and there is also the overhead of moving it
> back and forth too.

If PS is as I/O bound as you claimed, then in all of those instances it
would never pay to offload to a GPU, because the delay in waiting for
the bandwidth to send/fetch would exceed the parallelism speed-up.

> photoshop uses the gpu only when it will accelerate a given operation.
> if that operation would be faster on the cpu, then that's where it
> will run.

Which, as per your "much of the time" bandwidth claim, will be
rare-to-never and, as such, a moot point.

> > Overall, Adobe's not really a particularly sophisticated user of GPU
> > potential
> nonsense.

Oh, and you were doing so well! The cited Adobe webpage made it pretty
clear that their software is limited to using only one GPU card at a
time.

> most people have one gpu card. it's optimized for the common case.
> plus, a second gpu isn't necessarily better. again, not everything
> benefits from one gpu, let alone two. lots of apps don't use multiple
> gpus. adobe isn't unique in that regard.

Irrelevant & a distraction attempt.

> nope. it's reality.

You're trying to deflect from admitting that you were wrong in
disagreeing with my statement that Adobe is not a particularly
sophisticated user of GPU potential.

> of course i disagree. that statement is flat out absurd. (and you
> conveniently snipped that text)

As I had said: "Contrasting that limitation on sophistication, I can
personally recall working on a project with image analysis that used
multiple discrete GPU cards (IIRC, ~8) ... way back in 2004.

> *that* is irrelevant (and why it was snipped). what you did with some
> random analysis app in 2004 has absolutely nothing whatsoever to do
> with photoshop.

Incorrect, because both are processing images and have a design goal of
leveraging PC hardware to do it quickly. Indeed, both leveraged GPU
cards for parallelism ... but Adobe only went to the level of
sophistication of a single card.

> they are two totally different apps, with two totally different code
> bases and two totally different goals.

Different apps? Sure. Different code bases? Of course. Different goals?
You can't claim that, because you don't actually know what specifically
this other product did. FYI, what it did can be replicated in PS, but
was dog slow.

> you're also oblivious to the fact that photoshop was using multiple
> processors in the 1990s, ten years before you were working on that
> project. there wasn't much gpu acceleration back then, but there were
> dsp cards that dramatically accelerated photoshop.

Oh, I'm quite aware of PS on dual-CPU Macs in that era ... but let's
also keep in mind that in this period the CPUs were still often also
"single core". Even so, did Adobe ever progress to the point of being
able to be parallelized across 6 or 8 CPUs? Now you may wish to claim
that Adobe is "sophisticated", absolutely, but compared to how others
have done distributed graphical processing across multiple discrete GPU
cards, Adobe's current status is over a decade behind the state of the
art."

> nonsense. complete utter nonsense.

You're entitled to your own opinion, but not your own facts. And now
you're trying to rationalize why Adobe supports but a single GPU card
with a "most people" statement: that's a reasonably good case for
business optimization - - but it simply does not support a claim of a
superior level of technological sophistication.

> what matters is whether the end result is faster, not how many gpus
> are used.

Copying what I've already said to try to claim it for yourself, we see.

> using two gpus simply because two of them must be twice as good as one
> is crazy. it's a bigger number so it must be better! doesn't work that
> way.

But your claim was GPUs as a metric for _sophistication_.

> the same applies to multi-core. some (ignorant) people bitch about how
> photoshop doesn't always use all available cores. the answer is
> because sometimes it's faster to do a particular task on 1 or 2 cores
> than it is on 4 or 8 cores. photoshop is *incredibly* optimized, even
> tuned to specific versions of processors. to claim that adobe is not
> sophisticated is absurd.

Adobe does a pretty good job at optimization ... but within the box
that they've chosen to define for themselves. That has included the
choice to only support a single GPU, presumably because they know that
this higher level of sophistication is more difficult to successfully
code, and would be further toward diminishing returns in a marketplace
where there are fewer customers who need that bit more and are willing
to pay for what it costs. Which, once again, is a business decision - -
based on avoiding the cost of technological sophistication that may not
have a large enough customer base willing to pay what it would cost to
develop, test, and deploy.

-hh
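[Editor's note: the transfer-overhead vs. speed-up argument in the post
above can be made concrete with a toy break-even model. This is a hedged
sketch with made-up numbers (an assumed 8 GB/s bus and an assumed 10x
GPU speed-up), not Photoshop's actual scheduling logic.]

```python
# Toy model: offloading to a GPU only pays when the compute time saved
# exceeds the cost of shipping the data over the bus and back.
# All numbers are illustrative assumptions, not measured Photoshop behavior.

def gpu_time(cpu_seconds, speedup, data_bytes, bus_gb_per_s):
    """Round-trip transfer plus accelerated compute."""
    transfer = 2 * data_bytes / (bus_gb_per_s * 1e9)  # to the card and back
    return transfer + cpu_seconds / speedup

def offload_pays(cpu_seconds, speedup, data_bytes, bus_gb_per_s):
    """True when the GPU path beats running the whole task on the CPU."""
    return gpu_time(cpu_seconds, speedup, data_bytes, bus_gb_per_s) < cpu_seconds

# A heavy filter on a 100 MB image: compute dominates, offload wins.
print(offload_pays(2.0, 10, 100e6, 8))    # True
# A trivial adjustment on the same image: transfer dominates, stay on CPU.
print(offload_pays(0.01, 10, 100e6, 8))   # False
```

This is the shape of both arguments in the thread: for small-compute
operations the transfer term dominates and the CPU wins, which is
consistent with an application deciding per-operation whether to use the
GPU at all.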
#112
On 6/21/2017 4:01 PM, PAS wrote:
> On 6/21/2017 2:14 PM, nospam wrote:
> > In article , PAS wrote:
> > > > > Let's assume we have a Mac and a Windows PC with similar
> > > > > hardware specs. How would the memory performance be different
> > > > > and why would it be different? I'm ignorant as to how a Mac OS
> > > > > utilizes hardware. If we take an SSD out of the equation for
> > > > > both comparable systems, is the performance any different?
> > > > why take the ssd out of the equation? ssds on the latest macs
> > > > are much faster than the ssds usually found on windows systems,
> > > > especially if they're using a sata ssd, benchmarking in the
> > > > range of 3 gigabytes (not bits) per second.
> > > I'm taking the SSD out of the equation just for the sake of
> > > comparison. Comparison #1 is with an SSD, comparison #2 is without
> > > SSD. For argument's sake, using comparable hardware with an SSD,
> > > let's say that Mac OS runs 10% faster than Windows. Now, using
> > > comparable hardware with a "regular" hard drive, does the Mac
> > > still run 10% faster or does that margin go down to 5%? There
> > > could be no performance differences at all, I don't know.
> > no idea, and it can vary depending on what specifically you're
> > doing, so it's not worth worrying about. what matters is how
> > productive you are at doing what it is you want to do and the
> > overall user experience (something that doesn't show up on a
> > benchmark). another thing you may not realize (and part of the user
> > experience) is that when you first boot the imac, as part of the
> > setup process, it will ask if you want to migrate from an existing
> > computer or set it up as new. you can choose your windows computer
> > as a source, and it will copy all of your email (and isp settings),
> > browser bookmarks, contacts, photos, etc. it's not anywhere near as
> > complete as a mac migration (which includes everything), but it's
> > the best it can do. https://support.apple.com/en-us/HT204087
> The migration from Windows to the Mac may be as complete as a Mac
> migration but it doesn't sound too shabby. How long have they had that
> feature?
Should have read "The migration from Windows to the Mac *may not* be as complete..."
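[Editor's note: the "3 gigabytes (not bits) per second" SSD figure
quoted above is a sequential-throughput number. A rough way to measure
it yourself is to time a large sequential write and read. This is a
hedged sketch, not a proper benchmark: it uses the default temp
directory, and the read pass may be served from the OS page cache, so
treat the results as optimistic upper bounds.]

```python
# Rough sequential write/read throughput of the disk behind the default
# temp directory, reported in MB/s. Not cache-bypassing; illustrative only.
import os
import tempfile
import time

def sequential_throughput(size_mb=256, chunk_mb=8):
    """Return approximate (write_MBps, read_MBps) for a temporary file."""
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    fd, path = tempfile.mkstemp()
    os.close(fd)
    try:
        t0 = time.perf_counter()
        with open(path, "wb") as f:
            for _ in range(size_mb // chunk_mb):
                f.write(chunk)
            f.flush()
            os.fsync(f.fileno())  # push data to the device, not just the page cache
        write_s = time.perf_counter() - t0

        t0 = time.perf_counter()
        with open(path, "rb") as f:
            while f.read(chunk_mb * 1024 * 1024):
                pass
        read_s = time.perf_counter() - t0
        return size_mb / write_s, size_mb / read_s
    finally:
        os.remove(path)

if __name__ == "__main__":
    w, r = sequential_throughput()
    print(f"write ~{w:.0f} MB/s, read ~{r:.0f} MB/s")
```

A SATA SSD typically tops out near 550 MB/s sequentially, while the
NVMe drives in recent Macs of this era were advertised in the 2-3 GB/s
range, which is the gap the post is describing.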
#113
On Jun 21, 2017, hh wrote
(in ): On Wednesday, June 21, 2017 at 11:27:19 AM UTC-4, nospam wrote:

[the -hh/nospam GPU exchange quoted in full in #111 above, snipped]

You might find this interesting reading:
https://helpx.adobe.com/photoshop/kb/photoshop-cc-gpu-card-faq.html
http://www.nvidia.com/object/adobe-photoshop-cc.html
https://www.pugetsystems.com/labs/articles/Photoshop-CC-2017-NVIDIA-GeForce-GPU-Performance-899/

--
Regards,
Savageduck
#114
In article , PAS wrote:
> > another thing you may not realize (and part of the user experience)
> > is that when you first boot the imac, as part of the setup process,
> > it will ask if you want to migrate from an existing computer or set
> > it up as new. you can choose your windows computer as a source, and
> > it will copy all of your email (and isp settings), browser
> > bookmarks, contacts, photos, etc. it's not anywhere near as complete
> > as a mac migration (which includes everything), but it's the best it
> > can do. https://support.apple.com/en-us/HT204087
> The migration from Windows to the Mac may not be as complete as a Mac
> migration but it doesn't sound too shabby. How long have they had that
> feature?

for mac os (then called os x), since panther (10.3) in 2003, and for
windows, since lion (10.7) in 2011.
#115
On Wednesday, June 21, 2017 at 2:19:08 PM UTC-4, PAS wrote:
> ... Here's the story about moving to TN: it was my wife's idea to go
> to TN. We first checked out Murfreesboro, about 1/2 hour from
> Nashville. She wasn't impressed - too flat for her. Strange since Long
> Island is flat. Six months after that visit I suggested we go to the
> Johnson City area, hardly flat there at all. She loved the area but
> the homes in the areas that caught her fancy were very high-priced and
> far too large for us. I found a home online in an area and asked her
> if she wanted to see it. It's an ugly green color and that turned her
> off completely. She was a bit frustrated with the whole process too so
> that added to it. She was resigned to leaving without finding
> anything. The next morning she contacted a real estate agent and while
> we were eating lunch, the agent called and said there is a home she
> might be interested in and gave her the info. It was the green home. I
> checked and found that we were only eight miles from there and
> convinced her to at least go have a look. Off we went and as soon as
> we drove into the subdivision, she was in love with the place. We went
> to the house and found there were two lots available next to it. We
> decided to buy one and have a house built there. Next was the task of
> finding a home design we liked. She couldn't find anything she liked.
> We're accustomed to going into model homes and being able to choose.
> Not the case here. A year later and she could not find anything she
> liked. We met with the builder this past April and, after a
> discussion, he said he had a home design she might like. He showed her
> the floor plan and she liked it a lot. What floor plan is that? The
> same one as the ugly green house.

Because heaven forbid one ever repaint a house! <g> But seriously, some
people do find it difficult to visualize "what ifs", and can be messed
up by what someone else may find to be trivial (such as house or wall
colors).

> Homes in this subdivision range from the $300,000s to over $4,000,000.
> There's a short video of the subdivision here:
> http://oldislandcommunity.com/

FWIW, I've done some similar research in that same general region of TN
and found some interesting things to keep track of. For example, what
the monthly owner fees are for a lake or golf community ... I recently
browsed one that initially looked pretty promising - only to find that
it had a $1000 monthly fee because of said private lake.

Another consideration is that stuff that looks to be a pretty good deal
by our local standards ... such as new construction of a McMansion on a
~1/3 acre lot for 'only' $300K ... is probably out of line economically
with the region: your resale market is probably other Northern retirees
to the region who want to be in a now-old subdivision rather than the
next new one that's being built on YA plowed-under farm.

And if one strikes out on one's own (either an existing home, or to buy
land & custom build):
- Within city limits or not? (pros/cons)
- What's the neighborhood? Who are the neighbors? (four trucks in the
  front yard?)
- What utilities are/aren't available? (do you really want to
  build/maintain a well, septic?) (is there phone/internet/CATV even
  available?)
- How long of a drive to town for gas/groceries/anything? (isolation vs
  convenience, particularly when getting older)

Depending on what one is shopping for ... and willing to take on,
there's some interesting stuff. For example, here's a pretty extreme
example of an inexpensive property: it's a 3BR/2BA on a third of an
acre; here's the listing:
https://www.realestate.com/586-monroe-st-madisonville-tn-37354--41899125

Notice anything missing from this listing? Like how they forgot to
provide any pics of this $11K house? Well, here's the Google Street
View:
/data=!3m6!1e1!3m4!1shqTUx-kCJ3JxTmPAgJASOQ!2e0!7i3328!8i1664

Sunken lot probably means periodic flooding, and 1940 stick-built
probably means termite damage. IMO, this is a knock-down followed by
fill to bring it up from below grade (so it won't flood) before
deciding what to do (such as to drop in a doublewide or whatever), and
this consideration alone goes pretty far to understanding why they're
asking only $11K.

And on the 'Location, Location, Location' mantra: a few years ago, the
Mrs showed me a listing after I had casually tossed out some
preliminary requirements ... was something like "well, a 3BR/2BA Ranch,
1970 or newer, with 2-car attached garage, plus I want another 2-car
(detached okay) for a couple of summer toy cars. Oh, and since it's TVA
Lake Country, I'll have to take up powerboating, so a boat shed/barn
for that too. And sure, some land for a big garden...".

Well, it couldn't have been 15 minutes later when I was given the 'How
about this place?' on her iPad. 3BR/2BA? Check. 1970+ Ranch? Check.
With 2+2 garages? Check ... and the boat shed too! On ~9 acres ... for
something like $250K. The catch was that it was ~5 miles of narrow
winding country road to the 'main' road, and then it was another ~5
miles to town, so it was really out in the boonies. Oh, and much of its
acreage had been cleared, so the weekly chore list would include ~7
acres of lawn mowing. I've spent enough time already in Eastern TN in
the summer to know just how miserably hot & humid that chore would end
up being ... no thanks!

-hh
#116
On 6/21/2017 10:42 AM, PAS wrote:
> On 6/21/2017 10:02 AM, Savageduck wrote:
> > On Jun 21, 2017, PAS wrote (in article ):
> > > On 6/20/2017 4:49 PM, Savageduck wrote:
> > > > On Jun 20, 2017, nospam wrote (in ):
> > > > > In , wrote:
> > > > > > [snip] Take out corporate sales and the numbers are still
> > > > > > overwhelmingly in favor of Windows PCs and you know that.
> > > > > they aren't. look around. there are ****loads of macs.
> > > > Why not just take a poll among the usual suspects in this room?
> > > > As best I can recall the Mac users here are Alan Browne, Davoud,
> > > > David B., Sandman, Whisky-Dave, you, and me. That is 7
> > > > confirmed, there might be a few more. Confirmed Windows users
> > > > are Eric, PeterN, Tony Cooper, PAS, Mayayana, Bill W, Noons,
> > > > David Taylor, and probably at least 5 more for around 13. Then
> > > > there is Floyd who has no time for Windows, or MacOS, along with
> > > > the other Linux devotees.
> > > We've got two desktops. One is an older HP in our guestroom that
> > > was bought when Windows Vista was released. I have Windows 10 on
> > > it now and it's still going strong. It gets used mostly by guests
> > > when they stay over. My desktop is one I built about 1 1/2 years
> > > ago. We have three laptops in the house. Mine is seldom used. My
> > > wife refuses to give up her tired old laptop for the new one I
> > > bought her over a year ago. But very soon she'll have no choice. I
> > > plan on getting an iMac within the next year to see how I like it.
> > Whichever Mac you choose I recommend at least 16 GB RAM. The
> > transition might annoy you a little bit at first as some things will
> > be different due to a lifetime of habits. That said, I believe you
> > will find your new experience with a Mac surprisingly pleasant. Just
> > be patient through the adaption period, keep an open mind, and
> > remember you can always run Windows 10 on your new Mac, either with
> > a Bootcamp partition or VM. My recommendation would be to use VMware
> > Fusion.
> I will definitely spec it with 16GB of RAM, no less. Yes, I've
> developed quite a lot of habits using Windows for over 20 years. I'm
> sure the transition won't be too frustrating. I'll have all the time I
> need to get adapted - I'm retiring in 2 1/2 weeks.

After I retired, I started wondering how I ever had time for work.

--
PeterN
#117
On 6/21/2017 11:45 AM, PAS wrote:
> On 6/21/2017 11:27 AM, nospam wrote:
> > In article , PAS wrote:
> > > > > I plan on getting an iMac within the next year to see how I
> > > > > like it.
> > > > Whichever Mac you choose I recommend at least 16 GB RAM. The
> > > > transition might annoy you a little bit at first as some things
> > > > will be different due to a lifetime of habits. That said, I
> > > > believe you will find your new experience with a Mac
> > > > surprisingly pleasant. Just be patient through the adaption
> > > > period, keep an open mind, and remember you can always run
> > > > Windows 10 on your new Mac, either with a Bootcamp partition or
> > > > VM. My recommendation would be to use VMware Fusion.
> > > I will definitely spec it with 16GB of RAM, no less. Yes, I've
> > > developed quite a lot of habits using Windows for over 20 years.
> > > I'm sure the transition won't be too frustrating. I'll have all
> > > the time I need to get adapted - I'm retiring in 2 1/2 weeks.
> > what are you going to use the mac for? don't assume that memory
> > requirements of windows are the same as macos, particularly when the
> > mac has *extremely* fast ssd.
> Most likely I'll use it for Photoshop and other imaging apps. Maybe
> I'll like it enough to use it for everything else too. Let's assume we
> have a Mac and a Windows PC with similar hardware specs. How would the
> memory performance be different and why would it be different? I'm
> ignorant as to how a Mac OS utilizes hardware. If we take an SSD out
> of the equation for both comparable systems, is the performance any
> different?

Reason not to completely trust the nospam analysis. My younger daughter
is a well-known and respected graphics professional. Her office machine
is a Mac; for her home use she prefers a PC. Other corporate executives
in her company also prefer PCs.

--
PeterN
#118
On 6/21/2017 11:49 AM, Savageduck wrote:
> On Jun 21, 2017, PAS wrote (in article ):
> > On 6/21/2017 10:53 AM, Savageduck wrote:

[earlier Mac-vs-Windows poll discussion, quoted in full in #116 above,
snipped]

> > > > I'll have all the time I need to get adapted - I'm retiring in
> > > > 2 1/2 weeks.
> > > Let me be the first to welcome you to the Great Army of the
> > > Gainfully Unemployed.
> > Thank you! Lots of big changes in our lives. Retiring, selling our
> > home, packing up, leaving friends, heading to a new place for a new
> > life.
> IIRC your son is with NYPD. I guess he has his home somewhere in the
> NYC area. Where in NY is your home, and where are you planning to
> move? I made the move from Upstate NY to California over 40 years ago
> and I am quite content here on the California Central Coast in San
> Luis Obispo County. My only issue this week has been the current heat
> wave. Since Thursday last week we have had temperatures ranging from
> 103ºF-106ºF (39.4ºC-40.5ºC) with no relief in the offing until the
> weekend when we should have a cold snap in the mid 90's.

Even if he switches to the dark side, PAS is still a good guy, whom I
have the pleasure of personally knowing. (Even if he has never gone to
McNulty's.)

--
PeterN
#119
On Wed, 21 Jun 2017 13:21:27 -0700, Savageduck
wrote:

[the -hh/nospam GPU exchange quoted in full in #111 above, snipped]

> You might find this interesting reading:
> https://helpx.adobe.com/photoshop/kb/photoshop-cc-gpu-card-faq.html
> http://www.nvidia.com/object/adobe-photoshop-cc.html
> https://www.pugetsystems.com/labs/articles/Photoshop-CC-2017-NVIDIA-GeForce-GPU-Performance-899/

That last link is incorrect in saying "The one downside to these cards
is that they do not support 10-bit displays ...". 10-bit displays are
supported by the 1070 and up.

--
Regards,
Eric Stevens
#120
On Wed, 21 Jun 2017 04:56:19 -0700 (PDT), Whisky-dave
wrote:
> On Wednesday, 21 June 2017 05:04:13 UTC+1, Eric Stevens wrote:
> > As I have previously explained, I have no ROI so that wasn't one of
> > my considerations. The plain fact of the matter was that I had an
> > aging computer and Adobe was rejecting my GPU.
> when you bought the computer you didn't future proof it by buying a
> graphics card that adobe would support in the future, I thought that
> was your aim of future proofing.

It lasted me 6 years before I ran into that. Who is to complain?

> > True. But Adobe is making increasing use of the GPU and I would be
> > surprised if this trend does not continue.
> As with games one would hope, but sometimes hardware is the real key,
> which is why they keep coming out with faster & better graphics cards
> than adobe update their software.
> > > And thus, more the reason to keep on considering CS5/CS6
> > > characteristics such as via Digital Lloyd's "macperformance guide"
> > > pages, because these are still the present situation for some:
> > > they're not doing PS as a vocation, so they've stuck with the
> > > older versions instead of paying into CC's 'rental' business
> > > model.
> > I'm the reverse. I could never bring myself to pay Adobe's
> > horrendous up front cost (plus upgrades) but I can tolerate the
> > monthly rental.
> I can understand that and that would be the way I'd go, but in the
> past I couldn't afford photoshop, but I found other methods of
> acquiring it. It's costing me less than all the things I used to buy
> in the attempt at paying for CS5 or CS6. Yes I think the CC idea works
> well as a subscription, as they have it at a reasonable cost and it
> includes lightroom.

I doubt that I will go back.

--
Regards,
Eric Stevens