PhotoBanter.com

PhotoBanter.com (http://www.photobanter.com/index.php)
-   Digital Photography (http://www.photobanter.com/forumdisplay.php?f=5)
-   -   Where I keep my spare cats. (http://www.photobanter.com/showthread.php?t=130367)

Diesel August 9th 17 04:11 AM

Where I keep my spare cats.
 
nospam
Tue, 08 Aug 2017
20:33:00 GMT in rec.photo.digital, wrote:

it's a very good comparison and exactly why unix apps are second
rate. there's no incentive to fix any of the problems.


Horse ****.

only for mindless sheep does popularity matter.


That would be the majority of the general public.

that's why apple bashers keep citing the 90% market share as if
it's an indication of quality. it's not.


I'm not an Apple basher, either.

as for beta/vhs, tv stations used betacam (the pro version of
betamax), not vhs. they chose that because it was the only system
that did what they needed, not because it was popular, so your
example doesn't even prove what you thought it did.


Talk about moving goal posts. For the end user, betamax lost out to
vhs, and it wasn't because vhs was the better technology or better
video quality-wise. TV stations are not the typical end user, and they
usually have much nicer gear; specialized gear for the work they do.


prices are competitive and macs are the *only* platform
that can run mac, windows *and* unix.

http://emulators.com/

emulation means the host system *can't* run it, it has to
emulate it. you just proved my point.

Actually, I discredited what you wrote.

no you haven't. you don't even *understand* what i wrote.


Yes I do. You claimed that macs were the only ones that could run
mac, windows and unix, but, that's not true.


it is true.


No, it isn't. As the url I provided shows.

running mac os on a hackintosh is a violation of the eula, plus a
hackintosh doesn't do most of the things a genuine mac can do,
even if violating the eula wasn't a concern.


What exactly is a hackintosh? You keep spouting that mac does things
a PC cannot do; aside from running some very specific Apple software
or including some very specific Apple hardware, what specifically can
a mac do that a PC cannot? This is your chance to try and sell me on
a mac, btw. Or, perhaps, another individual reading our 'discussion',
if you want to stretch the word's meaning that far. I'll give you some
leeway on that.

The iphone battery typically lasts less than two years. And,
that's if you take good care of it.


complete utter nonsense. not even close to reality.

http://www.cio.com/article/2419524/c...gy/how-to-know-if-your-iphone-battery-is-on-death-row.html


bogus article.


I seriously doubt that.

http://www.macworld.com/article/1058.../iphonebattery.html


Oh, that wouldn't have any bias towards it. None at all. /sarcasm.

the reality is that an iphone battery lasts many *years* with
minimal degradation.


That isn't reality.

I've been in the IT industry professionally for several decades,
with the certifications to back it up, as well as having nearly a
decade's worth of electrical industry experience under my belt.
Certifications to back that up, too.


big deal. certs don't mean much of anything, other than being able
to memorize answers and pass a test.


If you think all certs are set up that way, then I'd question what
certs, if any, you actually have yourself...

those tests are so ridiculously easy that anyone with a room
temperature iq can pass one.


So which one(s) do you possess? I'll try not to laugh too much if you
evade my question or respond with none and provide some silly reason
to justify having none.

most of the it people i've known over the years aren't all that
smart. the few who know their **** don't stay there, they move up
to engineering in short order. they took an it job to get a foot
in the door.


I'm not like the IT people you've known over the years. Nor are the
majority of my peers. IT has become such a generalized term that it's
almost insulting to even use it, even when describing yourself.
Computer science doesn't have the same 'sales' potential either. And
'engineer', sadly, is approaching the same.

Fact is, your claim that AV companies created viruses to boost
sales of their own products is complete and utter bull****. They
don't need to do that, they've never needed to do that, and it
wouldn't give them ANY real advantage. Once something is ITW (in
the wild - and it needs to be for them to take interest),
collecting a viable sample is an easy process for any company. It
was part of my job duties when I worked for Malwarebytes, in fact.
I could easily collect several THOUSAND fresh 0day samples in a
few hours, with very little effort.


it's not bull****. anti-malware companies have a vested interest
for malware to exist. without malware, there's no need for their
products.


That's a crock of horse ****, too. There's more than enough existing
malware to last for years to come, without a single new one being
created from scratch. AV companies would LOVE it if they could
rededicate resources to more productive tasks rather than reversing
samples on a daily basis; automation only gets you so far in that
respect.

they need people to feel threatened by malware so that their sales
continue.


That's why almost every player has free versions of their products
that use the same detection/removal engine and shared definitions as
the commercial counterparts, right? Do you even believe the bull****
you're trying to peddle here yourself?


While you may in fact be an HLL programmer, and possibly a decent
one at that, it doesn't mean you actually have a clue how your
program works at the low level. Or what you're giving away about
yourself as you churn out program after program.


yet another feeble attempt at an insult and you demonstrate just
how little you actually know about software development at all
levels.


I made no effort to insult you. I was on point with my assumption
concerning your programming expertise, though, wasn't I? You *are*
an HLL programmer, correct?

it means bump up the clock speed, increase the
capacity/speed of the ssd, etc. of *existing*
configurations.

Are you going to overclock the existing cpu to bring up the
clock speed, or change out the cpu for one that runs at a
higher speed natively? I'm not a big fan of overclocking
myself. Or, did you throttle the cpu clock speed back and
decide to bring it up to the rate it was originally designed
to run at, and increase the cost to the consumer for that
'higher' clock speed? The latter seems shady to me. Almost
dishonest.

nobody said anything about overclocking.

yet another thing about which you know nothing.


Okay. I'll play along with this. Explain how you bumped up the
clock speed if you didn't do anything I described?


you haven't a clue.


Oh, yes, I most certainly do. You've entered a discussion with me
where you have no chance. You might as well have tried to bluff your
knowledge concerning assembler and/or machine code at this point. The
result for you would be the same: a thorough arse kicking by me.

nowhere did anyone say the end user bumped the specs.


I said nothing about the end user bumping anything. You claimed the
vendor (apple) bumped the clock spec. If they didn't do any of the
aforementioned things, what did they do to achieve it?



on a mac, there are no compromises when adding a 10 gig-e nic.
none. it will run at full 10g speeds.


Only if you have the required supporting hardware connected. And
even then, it might still be restricted to local network use at
full speed, because your outbound feed to the net probably isn't
going to be cruising that fast in all areas. At least, not here in
the States.


for someone who brags about their certs, you know very little.


Which certs do you hold? It's a valid question at this point.

10 gig is not for a connection to the internet, but rather for the
internal lan.


I believe I covered that with 'local network'. You know what LAN
means, right? LOCAL AREA NETWORK. Doh! 10 gig is possible via fibre,
btw.

And you *still* require 10-gigabit-capable switchgear to do it, and
every device you expect to run that fast must also support it. As
well as the cabling (shudder, yes, I said it, the cabling); something
you obviously turn your nose up at pulling.

And incidentally, depending upon your internet connection, it's
more than possible to pull 10 gigabits from it.

https://en.wikipedia.org/wiki/Fiber-optic_communication

a single desktop/laptop can *easily* saturate a gigabit link
without much effort, making gigabit the bottleneck.


If you're moving some rather large files around using multiple
computers, sure. A single machine isn't that likely to saturate the
entire 1-gigabit switchgear, though. A NAS system, OTOH, is another
beastie outright, and that's not exactly a fair comparison, either.

And most home users are not going to move enough data from machine to
machine at one time to saturate a gigabit-based LAN. Many small
businesses, depending on the type of business, won't even do that.
Digital movie production, serious photographic work, maybe. And the
latter is pushing it. I've set up audio recording studios that
wouldn't tax a 1-gigabit LAN. And that's working with RAW audio
files.

removing that bottleneck is generally a wise investment, one which
will pay for itself fairly quickly in increased productivity. it
might not matter that much for a home user, but it would for smb.


Of course it won't for a home user. A home user isn't going to have
multiple computers taxing a 1-gigabit link. It may matter for an SMB,
depending on what the network is actually being used for, how many
users are actually using it at one time, and the file
server/workstation configurations. However, to claim automatically
that it would matter for an SMB is a foolish claim that isn't
supported by facts.

the fact is that existing macs can easily support 10 gig, whereas
most pcs cannot. pc users will need a new computer. mac users
won't.


Actually, some PCs have had 10-gigabit ethernet built into the
mainboard for several years now. Gigabyte and Supermicro are two such
vendors. It's not a mac-only 'feature'. PC users have choices, whereas
with Apple, you get what Apple offers and only what Apple offers. It's
proprietary hardware, after all. You can't pick and choose the
mainboard you want to use, or the feature set based on the mainboard
selection, because that's not an option Apple offers you. It's Apple
or nothing.



for a mac (laptop or desktop, doesn't matter), connect it
with a cable. no need to even open the computer.

Well, for the PC, if it doesn't already support it, you can
easily add it:

http://www.tigerdirect.com/applicati...ls/item-details.asp?EdpNo=9745843

that won't fit in any laptop from any manufacturer nor will it
fit into many desktops, including the microsoft surface studio.


Funny, as it's not intended to be installed in a laptop or a tiny
wannabe desktop. It's intended for actual full-size desktop/tower
computers that typically do have the room, as they were built
with upgrading in mind.


you said popularity is what matters and what's popular are not
full size desktop/towers, but rather laptops.


And as I said, some PC-based systems have had 10-gigabit built in for
several years now. The add-on card I mentioned is for the older or
lower-upfront-cost machines that do not have it presently but whose
users, for whatever reason, may wish to add it. Users that have
serious LANs with an actual reason/need for it, that is: LAN setups
with one or more dedicated, serious-storage-capable NAS devices,
primarily. Not your typical home users.

furthermore, few people actually upgrade anyway.


That's nothing more than your own personal opinion, and one which I
do not share, because the experience I have in the field doesn't
support it. Apple users likely don't do much in the way of hardware
upgrades because they are rather limited in what upgrades they can
actually do. PC users are a different animal.

--


What you perceive, exists.

-hh August 9th 17 11:00 PM

Where I keep my spare cats.
 
On Tuesday, August 8, 2017 at 11:15:54 PM UTC-4, Diesel wrote:
nospam
Tue, 08 Aug 2017
20:33:00 GMT in rec.photo.digital, wrote:

it's a very good comparison and exactly why unix apps are second
rate. there's no incentive to fix any of the problems.


Horse ****.


Unfortunately, the Linux (+Unix) desktop products fail to
demonstrate otherwise...but I digress.



only for mindless sheep does popularity matter.


That would be the majority of the general public.


Except that both points are misguided...but I digress.


as for beta/vhs, tv stations used betacam (the pro version of
betamax), not vhs. they chose that because it was the only system
that did what they needed, not because it was popular, so your
example doesn't even prove what you thought it did.


Talk about moving goal posts. For the end user, betamax lost out to
vhs, and, it wasn't because vhs was better technology or video
quality wise. TV stations are not the typical end user, and, they
usually have much nicer gear; specialized gear for the work they do.


Except that Beta did die, which illustrates that VHS was determined
by the competitive marketplace to have been 'better', even if you're
not personally able to ascertain the reason why...YA digression.


prices are competitive and macs are the *only* platform
that can run mac, windows *and* unix.

http://emulators.com/

emulation means the host system *can't* run it, it has to
emulate it. you just proved my point.

Actually, I discredited what you wrote.

no you haven't. you don't even *understand* what i wrote.

Yes I do. You claimed that macs were the only ones that could run
mac, windows and unix, but, that's not true.


it is true.


No, it isn't. As the url I provided shows.


Just because one *can* do something, doesn't mean that it is in
full compliance with all legal obligations. Gosh, another digression.


The iphone battery typically lasts less than two years. And,
that's if you take good care of it.


complete utter nonsense. not even close to reality.

http://www.cio.com/article/2419524/c...gy/how-to-know-if-your-iphone-battery-is-on-death-row.html


bogus article.


I seriously doubt that.

http://www.macworld.com/article/1058.../iphonebattery.html


Oh, that wouldn't have any bias towards it. None at all. /sarcasm.


From a design engineering standpoint, lifespan boils down to what the
discharge limit parameters are set at: it is very easy to be tempted
to cheat and dip too far down, which kills a Li-Ion in relatively
few cycles but lets you claim a higher battery rating for marketing
and sales to sell product. Thus, if your experience is that the
stuff only barely lasts two years, you're either really beating on
it with multiple full cycles daily, or you've bought cheap crap that
was deliberately under-engineered in order to sell better.

Meantime, Apple's design-space choices for its iOS power specs are
apparently pretty conservative: a two-year life is what's expected at
roughly the 95th-percentile duty cycle, and practical battery-capacity
EOLs start to show up at the 4-5 year mark for the more typical
customer (50th percentile; phones tend to be cycled more per unit time
than tablets). YA digression.
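
A rough back-of-envelope sketch of that duty-cycle point, in Python;
the 500-cycles-to-80% rating and the cycles-per-day figures below are
illustrative assumptions, not numbers taken from this thread:

# Hypothetical: calendar life until a pack reaches ~80% of original
# capacity, assuming a 500-full-cycle rating (a commonly cited li-ion
# ballpark) and various daily duty cycles.
RATED_FULL_CYCLES = 500  # assumed cycles until ~80% capacity remains

def years_to_80_percent(full_cycles_per_day: float) -> float:
    """Calendar years until the assumed cycle rating is used up."""
    return RATED_FULL_CYCLES / full_cycles_per_day / 365.0

for per_day in (1.5, 0.7, 0.3):
    print(f"{per_day} full cycles/day -> ~{years_to_80_percent(per_day):.1f} years")

# 1.5/day -> ~0.9 yr, 0.7/day -> ~2.0 yr, 0.3/day -> ~4.6 yr: heavier
# cycling (or a pack rated for fewer cycles) pulls calendar life down
# toward the "barely two years" experience, while lighter cycling
# pushes it toward the 4-5 year EOL mark described above.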


the reality is that an iphone battery lasts many *years* with
minimal degradation.


That isn't reality.


A PSA "FYI" digression:
Next time you see an iOS device with the old-style (pre-Lightning)
plug, be aware that those devices are all now more than 5 years old.



I'm not like the IT people you've known over the years. Nor are the
majority of my peers.


That's because your day job is still as an Electrician's helper,
pulling wire...right? But we digress even further downhill.


they need people to feel threatened by malware so that their sales
continue.


That's why almost every player has free versions of their products
that use the same detection/removal engine and shared definitions as
the commercial counterparts, right? Do you even believe the bull****
you're trying to peddle here yourself?


How many IT certifications are there in Marketing? /S




a single desktop/laptop can *easily* saturate a gigabit link
without much effort, making gigabit the bottleneck.


If you're moving some rather large files around using multiple
computers, sure.


Even the latency will clobber you.

A single machine isn't that likely to saturate the
entire 1gigabit switchgear, though.


Gigabit Ethernet = 125 MB/sec
SATA-1 = 150 MB/sec

Oops!


And most home users are not going to move enough data from machine
to machine at one time to saturate a gigabit based LAN.


Where "most home users" is being defined as individuals who
employ DAS instead of NAS because a Gigabit Ethernet connected
NAS is more expensive and slower...right?

Many small businesses depending on the type of business won't
even do that.


Of course not, because the local data storage has been regularly
running on SATA-3 (600 MB/sec) for years now, which means that
Gigabit Ethernet has been a huge bottleneck for just as long.


Digital movie production, serious photographic work, maybe. And the
latter is pushing it. I've set up audio recording studios that
wouldn't tax a 1-gigabit LAN. And that's working with RAW audio
files.


Just because you've anecdotally not had bandwidth problems with GbE
doesn't mean that you can look down your nose at others.
For example, a circa-2010-technology "600x" CF card for still
photography was spec'ed at a 90 MB/sec read, which effectively pushed
Gigabit Ethernet to its practical limit...and the new stuff today
(such as CFast) has specs of 510 MB/s reads, which is 4x the max
theoretical for GbE (and is already ~40% of the max capacity of 10GbE).


removing that bottleneck is generally a wise investment, one which
will pay for itself fairly quickly in increased productivity. it
might not matter that much for a home user, but it would for smb.


Of course it won't for a home user. A home user isn't going to have
multiple computers taxing a 1gigabit link.


Depends on the home user and their use case ... a single higher-end
photo hobbyist can saturate a GbE link on but a single machine.
Contemplate making a 200GB transfer on GbE ... even if it were able
to run at theoretical max, it's still a quick half-hour time suck.
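
Worked out at theoretical line rates (real-world throughput is lower,
so this is the optimistic case):

# Best-case time for a 200 GB copy over each link, ignoring protocol
# overhead (link speeds in MB/s, as used above).
def transfer_minutes(size_gb: float, link_mb_per_s: float) -> float:
    return size_gb * 1000.0 / link_mb_per_s / 60.0

for link, speed in [("GbE, 125 MB/s", 125), ("10GbE, 1250 MB/s", 1250)]:
    print(f"200 GB over {link}: ~{transfer_minutes(200, speed):.1f} min")

# GbE:   ~26.7 min -- the "quick half-hour time suck" mentioned above
# 10GbE: ~2.7 min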


furthermore, few people actually upgrade anyway.


That's nothing more than your own personal opinion. One of which I do
not share, because the experience I have in the field doesn't support
your opinion. Apple users likely don't do much in the way of hardware
upgrades because they are rather limited in what upgrades they can
actually do. PC users are a different animal.


There have been industry studies which do indeed show that hardware
upgrades have become the exception, not the rule ... this isn't
the 1990s anymore.

And sure, IT-centric hobbyists tend to do more DIY upgrades,
although much of this is driven by economics: they're not able
to write off their expenses like a business does, even when
they have the free cash to spend on it, so there are different
motivations behind their behaviors. This tendency is strongest
amongst the Linux "l33t" stereotypes, who also tend to be those
with the tightest discretionary budgets, so it becomes a
self-fulfilling positive feedback loop.


-hh

Eric Stevens August 11th 17 05:50 AM

Where I keep my spare cats.
 
On Tue, 08 Aug 2017 16:33:00 -0400, nospam
wrote:

While you may in fact be an HLL programmer and possibly a decent one
at that, it doesn't mean you actually have a clue how your program
works at the low level. Or what you're giving away about yourself
as you churn out program after program.


yet another feeble attempt at an insult and you demonstrate just how
little you actually know about software development at all levels.

--

Regards,

Eric Stevens

nospam August 12th 17 08:38 PM

Where I keep my spare cats.
 
In article
XnsA7CBEE17927AAHT1@dr50n7Lg2Q3UT128P02Mrd6V9H9D2 cxu7y3AkI70C48M5kGFxO.
4m50YcRx, Diesel wrote:

it's a very good comparison and exactly why unix apps are second
rate. there's no incentive to fix any of the problems.


Horse ****.


it's true.

state of the art software is almost entirely mac/windows and more
recently, ios/android, because there's no financial incentive to bother
with anything else.

there's the occasional exception, but it's very rare. the best software
engineers do *not* work for free.

only for mindless sheep does popularity matter.


That would be the majority of the general public.

that's why apple bashers keep citing the 90% market share as if
it's an indication of quality. it's not.


I'm not an Apple basher, either.


bull****.

any time anyone says anything anti-apple, you cheer it, even when it's
completely bogus.

any time anyone says anything positive about apple, you immediately
dismiss it as false.

as for beta/vhs, tv stations used betacam (the pro version of
betamax), not vhs. they chose that because it was the only system
that did what they needed, not because it was popular, so your
example doesn't even prove what you thought it did.


Talk about moving goal posts.


which is why i pointed out that you did exactly that by mentioning
beta/vhs.

For the end user, betamax lost out to
vhs, and, it wasn't because vhs was better technology or video
quality wise. TV stations are not the typical end user, and, they
usually have much nicer gear; specialized gear for the work they do.


in other words, tv stations bought what fits their needs, not what was
popular.

did you have a point? nope.
did you just contradict yourself again? yep.


prices are competitive and macs are the *only* platform
that can run mac, windows *and* unix.

http://emulators.com/

emulation means the host system *can't* run it, it has to
emulate it. you just proved my point.

Actually, I discredited what you wrote.

no you haven't. you don't even *understand* what i wrote.

Yes I do. You claimed that macs were the only ones that could run
mac, windows and unix, but, that's not true.


it is true.


No, it isn't. As the url I provided shows.


wrong again.

macs can run mac/windows/unix apps *natively*. no emulation required.
no other system can do that.

not only that, but emulation won't work for a lot of apps, not just
mac, even if the performance hit was not an issue.

running mac os on a hackintosh is a violation of the eula, plus a
hackintosh doesn't do most of the things a genuine mac can do,
even if violating the eula wasn't a concern.


what exactly is a hackintosh?


it's a non-mac that runs mac os, except that it requires jumping
through a lot of hoops to get it to work and more hoops to keep it
working in the event updates undo previous hoops. also, quite a bit of
mac os won't work at all because the hardware it requires is not
present.

You keep spouting that mac does things
a PC cannot do, aside from running some very specific apple software
or including some very specific Apple hardware; what specifically can
a mac do that a PC cannot? This is your chance to try and sell me on
a mac, btw. Or, perhaps another individual reading our 'discussion'
if you want to stretch the words meaning that far. I'll give you some
leeway on that.


i already listed a bunch of things.

you didn't understand any of them then and i haven't seen any
indication that you will now.

you claimed a pc could emulate most of them (which is amusing in
itself). i challenged you to cite specific examples, but you (wisely)
chose to not embarrass yourself any further.

i'm not interested in selling you or anyone else on a mac. i don't give
a flying **** what your or anyone else uses.

what matters is that people make an *informed* *decision* based on
*accurate* information, not myths and propaganda, so that they can
choose a product that best fits their needs, no matter who makes it.

no single product is best at everything.

The iphone battery typically lasts less than two years. And,
that's if you take good care of it.


complete utter nonsense. not even close to reality.

http://www.cio.com/article/2419524/c...gy/how-to-know-if-your-iphone-battery-is-on-death-row.html


bogus article.


I seriously doubt that.


doubt it all you want, but it's completely bogus. it's bull****. you
can't see past your hate to realize it's bull****.

batteries do *not* fail in two years, whether it's apple, samsung or
some other company.

android devices, which use the same battery technology as apple does
(lithium ion or lithium polymer), often from the very same battery
manufacturer, last much longer than 2 years.

http://www.macworld.com/article/1058.../iphonebattery.html


Oh, that wouldn't have any bias towards it. None at all. /sarcasm.


you only believe something if it's anti-apple.

jason is one of the most respected journalists in the industry and that
article is *accurate*.

he's also quoting apple, who are legally bound to *not* lie about their
products.

the reality is that an iphone battery lasts many *years* with
minimal degradation.


That isn't reality.


yes it is reality, and not just apple either.

in general, lithium polymer and lithium ion batteries are rated at 5
years with minimal degradation (80% capacity).

only a *defective* battery will fail in 2 years.

nothing is perfect and a tiny percentage of batteries will be defective
and may prematurely fail, but that's the *exception*, not the rule.

defective batteries will be replaced *for* *free* under warranty and in
many cases, outside of warranty. apple is well known for the latter,
other companies not so much.

nearly half of iphones currently in use (as of this past april) are at
least 3 years old:
http://zdnet3.cbsistatic.com/hub/i/r...015-46e7-b23b-
b1db6c59eb52/resize/770xauto/fbef1701d6f7cfaab3fff552d60fd517/2017-05-01
12-41-05.jpg

so much for your bull**** claim of it'll be dead in two years.

even at 80%, it's still very usable. an iphone 7+ is rated for 21 hours
talk time and 15 hours wifi use, so after 5 years at 80% capacity, it
would have roughly 17 hours talk time and 12 hours internet use. that's
still *very* good.
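
Spelling out that arithmetic (the 21/15-hour ratings are the ones
quoted in the post; 80% is the end-of-rating capacity assumed above):

# Rated hours scaled by the 80% capacity remaining after years of use.
ratings_hours = {"talk time": 21, "wifi use": 15}  # iphone 7+ as quoted
for name, hours in ratings_hours.items():
    print(f"{name}: {hours} h new -> ~{hours * 0.8:.0f} h at 80% capacity")
# talk time: 21 -> ~17 h, wifi use: 15 -> ~12 h, as stated above.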

in fact, that's better than many android phones when new, some of which
can't make it through a single day (mostly on standby) without being
recharged.

complete non-issue, fabricated by a hater.



it means bump up the clock speed, increase the
capacity/speed of the ssd, etc. of *existing*
configurations.

Are you going to overclock the existing cpu to bring up the
clock speed, or, changeout the cpu for one that runs at a
higher speed, natively? I'm not a big fan of overclocking
myself. Or, did you throttle the cpu clock speed back and
decide to bring it to the rate it was originally designed to
run and increase cost to consumer for that 'higher' clock
speed? The latter seems shady to me. Almost dishonest.

nobody said anything about overclocking.

yet another thing about which you know nothing.

Okay. I'll play along with this. Explain how you bumped up the
clock speed if you didn't do anything I described?


you haven't a clue.


Oh, yes, I most certainly do. You've entered a discussion with me
where you have no chance. You might as well have tried to bluff your
knowledge concerning assembler and/or machine code at this point. The
result for you would be the same: a thorough arse kicking by me.


you're talking out your ass again. you have *no* clue what the term
speed bump means, a term that's commonly used in the industry. you
guessed wrong and now trying to save face.

you're full of **** and you ain't fooling *anyone*.

nowhere did anyone say the end user bumped the specs.


I said nothing about the end user bumping anything.


yes you did:
Are you going to overclock the existing cpu to bring up the
clock speed, or, changeout the cpu for one that runs at a
higher speed, natively? I'm not a big fan of overclocking



You claimed the
vendor (apple) bumped the clock spec. If they didn't do any of the
aforementioned things, what did they do to achieve it?


what apple did was release a *new* product with better specs than the
previous model, the same as many other companies do.

you're not involved in the industry to know commonly used terms and
completely fail at pretending to be.




a single desktop/laptop can *easily* saturate a gigabit link
without much effort, making gigabit the bottleneck.


If you're moving some rather large files around using multiple
computers, sure. A single machine isn't that likely to saturate the
entire 1-gigabit switchgear, though. A NAS system, OTOH, is another
beastie outright, and that's not exactly a fair comparison, either.


a single mac can *easily* saturate a gigabit link without any
effort whatsoever, *without* a nas.

nases are also *very* common, particularly for mac users. nothing
unfair about a nas, although not required in this example.

And most home users are not going to move enough data from machine to
machine at one time to saturate a gigabit-based LAN. Many small
businesses, depending on the type of business, won't even do that.
Digital movie production, serious photographic work, maybe. And the
latter is pushing it. I've set up audio recording studios that
wouldn't tax a 1-gigabit LAN. And that's working with RAW audio
files.


your knowledge and experience is incredibly limited.

those studios should hire someone who knows what they're doing.

removing that bottleneck is generally a wise investment, one which
will pay for itself fairly quickly in increased productivity. it
might not matter that much for a home user, but it would for smb.


Of course it won't for a home user. A home user isn't going to have
multiple computers taxing a 1gigabit link.


yes they will, and have for many years.

usb 3.0 is 5 gb/s.
usb 3.1 gen 2 is 10 gb/s.

that's already an issue.

recent macs can do 2-3 giga *bytes*/sec from ssd, which can saturate a
10 gig link, nevermind gigabit, which it does routinely.

https://apple.insidercdn.com/gallery/18862-18409-Extended-l.jpg
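
Converting those quoted link speeds to bytes per second (divide Gb/s
by 8) makes the mismatch plain; a small sketch:

# Quoted link speeds converted from gigabits/s to megabytes/s.
links_gbps = {"Gigabit Ethernet": 1, "USB 3.0": 5, "USB 3.1 Gen 2": 10}
for name, gbps in links_gbps.items():
    print(f"{name}: {gbps} Gb/s = {gbps * 1000 / 8:.0f} MB/s")
# 125, 625 and 1250 MB/s respectively -- so even a single USB 3.0
# external drive can feed data roughly five times faster than a
# gigabit network link can carry it.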


It may matter for an smb,
depending on what the network is actually being used for and how many
users are actually using it at one time as well as file server/work
station configurations. However, to claim automatically that it would
for smb is a foolish claim to make that isn't supported by facts.


it's well supported, even for a solo user, so it absolutely would for
smb.

the fact is that existing macs can easily support 10 gig, whereas
most pcs cannot. pc users will need a new computer. mac users
won't.


Actually, some pcs have had 10gigabit cards built into the mainboard
for several years now.


not at the consumer level and certainly not on laptops, due to cost,
space and thermal constraints.

macbooks under $1000 can use 10 gig network adapters at full bandwidth.

Users that have serious
LANs that actually do have a reason/need for it, that is. LAN setups
with one or more dedicated and serious storage capable NAS devices,
primarily. Not your typical home users.


wrong.

there's no need for a 'serious lan' or 'serious storage capable nas
devices' to benefit from 10 gig.

anything with usb 3.0 can saturate gigabit.

recent macs push 10 gig to its limits.

just two computers on the network transferring files, and gigabit is a
bottleneck.

furthermore, few people actually upgrade anyway.


That's nothing more than your own personal opinion.


it's a well established industry statistic.

One of which I do
not share, because the experience I have in the field doesn't support
your opinion.


your experience, as with everything else you've done, is extremely
limited, and what's worse is you blindly discard what others have done
and refuse to learn.

Apple users likely don't do much in the way of hardware
upgrades because they are rather limited in what upgrades they can
actually do.


nonsense.

macs have a longer useful life than pcs and can be upgraded when pcs
cannot. that's one of many reasons why the resale value of a mac is
much higher than a pc.

even a lowly $499 mac mini from 5 years ago can have a 10g nic plugged
into it and can utilize all of its bandwidth, thereby extending its
useful life several more years.

a similar pc from that era would need to be replaced.

PC users are a different animal.


nope.

pc users, just like mac users, want to get work done.

users don't give a **** about what parts are inside and when it comes
time to upgrade, they get a new computer because *everything* in it is
outdated, not just one component.

upgrading would entail replacing *all* of the parts piecemeal, which
would cost *more* than buying a new system.

nospam August 12th 17 08:38 PM

Where I keep my spare cats.
 
In article
XnsA7CBEE174B5DBHT1@dr50n7Lg2Q3UT128P02Mrd6V9H9D2 cxu7y3AkI70C48M5kGFxO.
4m50YcRx, Diesel wrote:

AV companies do NOT create malware in the hopes of being the
first to detect something in an effort to boost their sales or
their ratings with joe public or the competition. That's NOT
how it works.

except that one did exactly that, which means that it's
extremely likely they're not the only one who resorts to sleazy
tactics.

Except that the article you cited specifically stated the
'malware' was a dud, i.e., a bull**** file. NOT malware. Maybe you
should re-read the material you're going to share before doing
so.


i did not cite any article about that, nor is there an article to
cite. it's all word of mouth.


You cited an article previously discussing Kaspersky and their fake
malware sample to catch people duplicating their signature
information, actually. You confused that with support for your
erroneous claim that AV companies produce viruses for the benefit of
their own product over that of a competitor's.


it's you who is confused. that link was *in* *addition* to the claim
that an anti-malware company created malware for its own benefit, to
show that as a group, anti-malware companies have resorted and continue
to resort to sleazy tactics to increase sales.

and it ain't just anti-malware companies either. lots of companies do
unethical things to increase sales. some get caught. some don't.

you should read what i wrote more carefully before spouting.


You attempt to backtrack often, eh? How often do you get stuck while
doing so?


i'm not backtracking whatsoever.


Bull**** again. I already told you, I was on a first/last name
basis with many big players in the av/am scene while being an
active vxer. If what you wrote was ever true (and it wasn't,
not ever. Zvi Netiv and Doren Rosenthal do not count, if
that's the 'people' you know) and it isn't, I would know about
it. Hell, I might have been in on it directly or indirectly
via one of the groups I ran with.

they aren't and you weren't.

You'd never know. You didn't even know I'm a former known vxer
until I mentioned it. When I posted the rolling stone article
about me, you shut the **** up for a bit though. ROFL!


i don't give a **** about that article or any other article. you
keep mentioning it as if it's some sort of accomplishment. it's
not, nor does it impress me at all.


When I mentioned a bit about my background, you claimed bull****.
When I provided the article, and stalker david backed me up (several
times now), you changed your tune to the above. Quite pathetic.


a rolling stone article means nothing. rolling stone is not a tech
journal. nobody reads the rolling stone for tech information, or much
else for that matter.

I didn't. But I do have to laugh quite a bit at your arrogance.
You don't have enough firsthand knowledge to support it.


you're not one to talk about arrogance.


The thing is, I can back up what I say. Whereas you, heh, cannot.
That's not arrogance, btw. It ain't bragging if you can back it up.
BFG


except that you haven't backed up a single thing.

just about everything you've said, particularly about apple, has been
shown to be wrong, sometimes *very* wrong.

an amazing track record, and not in a good way.

you think you're hot ****. you're not. there's a lot you *don't*
know and you refuse to learn.


I'm not the one who thinks he's considerably more than he actually
is. I leave such horse**** thinking to yourself.


says the person who has been repeatedly shown to be wrong about all
sorts of stuff and refuses to learn *anything*.

nospam August 12th 17 08:38 PM

Where I keep my spare cats.
 
In article , -hh
wrote:

the reality is that an iphone battery lasts many *years* with
minimal degradation.


That isn't reality.


A PSA "FYI" digression:
Next time you see an iOS device with the old-style (pre-Lightning)
plug, be aware that those devices are all now more than 5 years old.


there's still a lot of those in use, along with plenty of 5/5c/5s,
which are now 3-5 years old.

there are also a *lot* of 3-5 year android devices in use, which use
the same battery technology.


a single desktop/laptop can *easily* saturate a gigabit link
without much effort, making gigabit the bottleneck.


If you're moving some rather large files around using multiple
computers, sure.


Even the latency will clobber you.

A single machine isn't that likely to saturate the
entire 1gigabit switchgear, though.


Gigabit Ethernet = 125 MB/sec
SATA-1 = 150 MB/sec

Oops!


yep.

and then there's sata ii (3gb/s) and sata iii (6 gb/s), along with usb
3.0 (5gb/s) and usb 3.1 gen 2 (10 gb/s), for an even bigger oops.

except that none of that compares with the ssd in the current macbook
pros, which benchmarks around 2.6 to 3 giga *byte*/sec. not
gigabits/sec. *gigabytes*/sec.

https://apple.insidercdn.com/gallery/18862-18409-Extended-l.jpg

that's so incredibly fast that even a 10 gig network connection becomes
a significant bottleneck, let alone gigabit, and that's from *one*
single computer, one that starts around $1200 (or less for savvy
shoppers).

now *that* is an oops.

benchmarks aren't real world usage patterns, but even if typical use is
just a fraction of its maximum speed, it will *still* saturate gigabit
without even trying.
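
as a rough sketch of how lopsided that is (using 2800 MB/s as an
assumed mid-point of the 2.6-3 GB/s benchmark range quoted above):

# How badly each network link bottlenecks the quoted macbook pro ssd.
SSD_MBPS = 2800.0                        # assumed mid-point of 2.6-3 GB/s
LINKS = {"GbE": 125.0, "10GbE": 1250.0}  # theoretical maxima, MB/s

for name, link_mbps in LINKS.items():
    print(f"{name}: ssd is ~{SSD_MBPS / link_mbps:.0f}x faster than the link")

# GbE:   ssd ~22x faster -> gigabit is saturated without even trying
# 10GbE: ssd ~2x faster  -> even a 10 gig link is still the bottleneck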

And most home users are not going to move enough data from machine
to machine at one time to saturate a gigabit based LAN.


Where "most home users" is being defined as individuals who
employ DAS instead of NAS because a Gigabit Ethernet connected
NAS is more expensive and slower...right?


yep.

a das would be connected via usb 3.0 (5gb/s), possibly usb 3.1 gen 2
(10gb/s).

Many small businesses depending on the type of business won't
even do that.


Of course not, because the local data storage has been regularly
running on SATA-3 (600 MB/sec) for years now, which means that
Gigabit Ethernet has been a huge bottleneck for just as long.


yep.

also keep in mind that macs use pci-e nvme ssd because sata is much too
slow. see above for 3000 mbyte/s benchmarks, ~5x higher.

Digital movie production, serious photographic work, maybe. And the
latter is pushing it. I've set up audio recording studios that
wouldn't tax a 1-gigabit LAN. And that's working with RAW audio
files.


Just because you've anecdotally not had bandwidth problems with GbE
doesn't mean that you can try to look down your nose at others.
For example, a circa 2010 technology "600x" CF card for still
photography was spec'ed at a 90 MB/sec read, which effectively pushed
Gigabit Ethernet to its practical limit...and the new stuff today
(such as Cfast) has specs of 510 MB/s reads, which is 4x the max
theoretical for GbE (and is already ~40% of the max capacity of 10GbE).


yep.

removing that bottleneck is generally a wise investment, one which
will pay for itself fairly quickly in increased productivity. it
might not matter that much for a home user, but it would for smb.


Of course it won't for a home user. A home user isn't going to have
multiple computers taxing a 1gigabit link.


Depends on the home user and their use case ... a single higher end
photo hobbyist can saturate a GbE link on but a single machine.
Contemplate making a 200GB transfer on GbE ... even if it was able
to run at theoretical max, its still a quick half hour time suck.


yep.

while 200 gig files may not be common, 5-10 gig files certainly are,
and even then, the difference is noticeable.
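
for the 5-10 gig files mentioned, the best-case copy times work out
roughly like this:

# Best-case copy times at theoretical link rates (overhead ignored).
def seconds(size_gb: float, link_mb_per_s: float) -> float:
    return size_gb * 1000.0 / link_mb_per_s

for size_gb in (5, 10):
    gbe, ten_gbe = seconds(size_gb, 125), seconds(size_gb, 1250)
    print(f"{size_gb} GB: ~{gbe:.0f} s on GbE vs ~{ten_gbe:.0f} s on 10GbE")

# 5 GB:  ~40 s vs ~4 s
# 10 GB: ~80 s vs ~8 s -- a noticeable difference, as noted above.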

furthermore, few people actually upgrade anyway.


That's nothing more than your own personal opinion. One of which I do
not share, because the experience I have in the field doesn't support
your opinion. Apple users likely don't do much in the way of hardware
upgrades because they are rather limited in what upgrades they can
actually do. PC users are a different animal.


There's been industry studies which do indeed show that hardware
upgrades have become the exception, not the rule ... this isn't
the 1990s anymore.


yep.

And sure, IT-centric hobbyists tend to do more DIY upgrades,
although much of this is driven by economics: they're not able
to write off their expenses like a business does, even when
they had the free cash to spend on it, so there's different
motivations to their behaviors. This tendency tends to be strongest
amongst the Linux "l33t" stereotypes, which also tend to be those
who have the tightest discretionary budget too, so it becomes a
self-fulfilling positive feedback loop.


yep.

Diesel August 16th 17 03:48 AM

Where I keep my spare cats.
 
Eric Stevens
Fri, 11 Aug 2017
04:50:38 GMT in rec.photo.digital, wrote:

On Tue, 08 Aug 2017 16:33:00 -0400, nospam
wrote:

While you may in fact be an HLL programmer and possibly a decent
one at that, it doesn't mean you actually have a clue how your
program works at the low level. Or what you're giving away about
yourself as you churn out program after program.


yet another feeble attempt at an insult and you demonstrate just
how little you actually know about software development at all
levels.


ROFL. Yet, you'll notice nospam doesn't offer any rebuttal to my
post, other than a poor attempt at a cheeky insult.


--



'This is a job for.. AACK! WAAUGHHH!! ...someone else.' -- Calvin

