Elite Engineer

What are some of the causes of computers breaking down over time... say 5-6 years?

40 posts in this topic

Aye. There'll come a point where I start fossilising into certain routines. In 15 years I'll be 70.

Yes, me too. I'll be 57 in 15 years.

 

As for the causes of computer breakdown, it depends on the geography... heat, dust and the user are the main causes. There aren't many mechanically moving parts in PCs and laptops, so prolonged heat exposure is the main reason parts fail. Printers, for example, contain a lot of moving parts, and there the main cause of faults is the user (about 70% of all service calls). When getting a PC it is always a good idea to get a good-quality heatsink and radiator for the CPU and video card. It will cost 20 bucks more but might give you years more of working time. And it's crucial to keep the dust away; I've seen equipment break down 10 times quicker in desert, dusty areas.


If electronic components survive the first few days, they'll last forever...

As long as they have well regulated supply voltages, and are kept in the allowed temperature range.

Mechanical systems, however, tend to deteriorate over time.
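As a rough illustration of why the allowed temperature range matters: reliability engineers often model component ageing with the Arrhenius equation, under which hotter parts age exponentially faster. A minimal sketch, with the 0.7 eV activation energy an assumed typical value rather than anything measured:

```python
import math

def arrhenius_acceleration(t_hot_c, t_ref_c, ea_ev=0.7):
    """Acceleration factor AF = exp(Ea/k * (1/T_ref - 1/T_hot)).

    Components held at t_hot_c age roughly AF times faster than at
    t_ref_c. The activation energy Ea (0.7 eV here) is an assumed
    typical value; real components vary.
    """
    k = 8.617e-5  # Boltzmann constant in eV/K
    t_hot = t_hot_c + 273.15  # convert Celsius to kelvin
    t_ref = t_ref_c + 273.15
    return math.exp((ea_ev / k) * (1.0 / t_ref - 1.0 / t_hot))

# Running 20 degC hotter than the design point speeds ageing
# by a factor of roughly 4-5 under these assumptions:
af = arrhenius_acceleration(t_hot_c=60, t_ref_c=40)
```

So a machine with a clogged heatsink running 20 °C hotter can plausibly wear out its silicon several times faster, which is the quantitative version of "keep it in the allowed temperature range".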

 

I noticed Sensei uses a CoolerMaster power supply: It is a 'name brand' which provides good voltage regulation/transient protection.

If he also keeps his system clean and well ventilated, the electronics in his computer will last long after his hard drives have crashed, his connectors have deteriorated, and it's been slowed/bloated by malware/adware/Trojans/viruses.

 

If you consider a second generation Intel i5-2xxx, you will find that it is not much slower than a seventh generation i5-7xxx. This is because Intel hasn't had any 'real' competition since the AMD Athlon, and has concentrated on reducing process size/power consumption. A 'decent' system from 6 yrs ago will keep up with most new systems except in power consumption (laptops hold charge longer) and gameplay (as new graphics cards are massively more parallel).

With the introduction of the AMD Ryzen architecture, we have real competition again, and our systems may become obsolete much quicker as competition leads to quicker advances.


 

Except when you don't have much money and a machine has to last a while. Especially when performing only common tasks such as web browsing and some office applications, it seems ridiculous to keep upgrading.

 

I have a PC with a 3400 APU and 6450 GPU, bought five years ago. Very low end. Works fine for said tasks, and will likely continue to work fine for said tasks for several more years.

 

I agree +1

 

I have one bench machine I built some years back that came with a P3 and Windows 98.

After several years I installed Windows 2000.

I keep it as a dedicated machine for writing CDs and DVDs.

No modern machine can do any better because the CD interface limits the throughput.
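That bottleneck is easy to put numbers on: optical drive speeds are quoted as multiples of the original 1x CD rate of 150 KiB/s, so even the fastest common 52x burners top out well under 8 MiB/s, far below what any machine of the last two decades can feed them. A quick back-of-envelope sketch:

```python
# CD writing speed is quoted in multiples of the original 1x rate,
# 150 KiB/s. Even a maxed-out 52x burner moves under 8 MiB/s, so the
# drive itself, not the host computer, is the bottleneck.
BASE_RATE_KIB_S = 150  # 1x CD data rate

def cd_throughput_mib_s(speed_multiplier):
    """Peak transfer rate in MiB/s for a given Nx drive speed."""
    return speed_multiplier * BASE_RATE_KIB_S / 1024

fast_burner = cd_throughput_mib_s(52)  # about 7.6 MiB/s
```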

 

 

If electronic components survive the first few days, they'll last forever...


Sorry I can't agree that materials science is that good.

 

This was the fallacious dream promoted by the early introducers of semiconductor devices.

 

Certainly materials science has improved dramatically since the tin whisker dendrites of the early germanium transistors, but it is far from immortal.

Solid state diffusion still occurs, albeit at a reduced rate.


Who knows? There might be 3D holographic projections which you touch, or wave your arms about in, to do stuff, and that may require power that's not available yet. We are probably good for 10 years.

 

They can keep gimmicks like that, thank you very much :P Joking aside, it's possible, but requires breakthroughs that likely won't be ready for the consumer market any time soon. Not to mention that they have to catch on.

 

I speculate that everything will be done in the cloud and the only state of the art piece of hardware in our computers/devices will be a super fast WiFi card.

Oh lord, please no :( Would be the end of my computer hobby :(

 

I keep it as a dedicated machine for writing CDs and DVDs.

That's not a bad idea.

 

This was the fallacious dream promoted by the early introducers of semiconductor devices.

True, but they can still last decades. My Amiga is a good example.

Edited by Thorham

 

They can keep gimmicks like that, thank you very much :P Joking aside, it's possible, but requires breakthroughs that likely won't be ready for the consumer market any time soon. Not to mention that they have to catch on.

You perhaps remember a time when MS-DOS diehards thought Windows 98 was a gimmick, or even Linux nerds who worked the same sort of command-line interface.

Edited by StringJunky

You perhaps remember a time when MS-DOS diehards thought Windows 98 was a gimmick, or even Linux nerds who worked the same sort of command-line interface.

 

Not to mention that they have to catch on.

 

 

The second point is the whole point.

 

I remember a time when VHS and Betamax fought it out and the worst man won.

 

Today I have just had the most unfortunate buying experience with a once proud British company.

 

I tried to buy a 2017 model television.

 

Despite their website's blandishments, yes they have one, in Edinburgh, but they won't move it to their store in Dundee.

 

My trade supplier has sold out and has no plans to replenish stock.

 

Why?

 

Because it is 'obsolete', i.e. not 4k.

 

But, because I am obstinate, I located a stock of these in Prague, with delivery to Dundee.

 

No problem.

 

Why am I obstinate?

 

Well look at the BBC website.

 

They do not currently broadcast anything in 4k and have no plans to do so in the foreseeable future.

 

So I am buying an 'obsolete' 2017 model to receive current and future transmissions as they are actually broadcast.

 

Who deserves the business?

Edited by studiot

 

 


I said to myself "Bah, humbug" to TV 10 years ago and haven't had one since. You could rig a 4K HDMI TV to your computer and use it as a monitor.

Edited by StringJunky

I'm gonna get me one of those 6K 80-inch TVs or a state-of-the-art projector one day to watch nature/space films/documentaries with my son. I bought a 720p Pioneer plasma 10 years ago when I had the money, spent a fortune on it, and I think I made a great decision... I use it for movies, work, and the PS3 when the kids come over. It's a great device with beautiful colour rendering even to this day.


I like to think this is more of a hardware issue, and more specifically a chemistry issue.

 

Do the buses in the motherboard just wear out over time due to constant current? Does the CPU eventually warp over time?

 

I don't know much about computers, so sorry if my theories seem really fundamental.

 

~ee

 

A software issue is the most common fault with computers, but let's focus on hardware issues. A common cause of hardware fault is dust build-up: the fan sucks in dust, which clogs up the heatsink and impedes its ability to keep the CPU cool. A CPU will fail if it gets too hot, so there's a thermal safety feature which causes the computer to switch off when it reaches a certain temperature, like 90°C. Obviously fans can fail as well, which will cause the computer to overheat, etc.
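The thermal safety feature is essentially a threshold check done in hardware and firmware; its decision logic can be sketched like this (the 80 °C and 90 °C trip points are illustrative, not any particular CPU's actual limits):

```python
# Illustrative sketch of CPU thermal protection. Real CPUs implement
# this in hardware/firmware; the trip points below are made-up
# round numbers, not a specific chip's specification.
THROTTLE_AT_C = 80   # illustrative: slow the clock down from here
SHUTDOWN_AT_C = 90   # illustrative: hard power-off from here

def thermal_action(cpu_temp_c):
    """Return the protective action for a given die temperature."""
    if cpu_temp_c >= SHUTDOWN_AT_C:
        return "shutdown"   # last resort: cut power to save the die
    if cpu_temp_c >= THROTTLE_AT_C:
        return "throttle"   # reduce clock speed to shed heat
    return "ok"
```

A dust-clogged heatsink doesn't change this logic; it just means the CPU reaches the trip points under ever lighter loads, which is why a dusty machine starts switching itself off.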

 

To answer your question properly, I'll focus purely on the hardware without any external influences such as software, dust, power surges, fan failure, manufacturing defects, human intervention etc.

 

Imagine a computer in a cleanroom environment. What could cause it to fail?

 

1. Electrolytic capacitors can fail.

2. A computer gets warm when you switch it on and cools when you switch it off, over and over. All the metal parts are expanding and contracting, again and again. Some metals expand more than others due to differences in the forces between the atoms, so solder joints get weak, etc. A computer should actually last longer if you never switch it off, because the metal won't contract.
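The expansion/contraction point can be put in rough numbers with ΔL = α·L·ΔT. Using typical handbook coefficients of thermal expansion (around 14 ppm/°C for an FR4 board and 6 ppm/°C for a ceramic package, both assumed figures rather than measurements), the two sides of a solder joint grow by different amounts on every warm-up, and the joint absorbs the difference:

```python
# Rough illustration of thermal-cycling stress on a solder joint.
# CTE values are typical handbook figures (assumed), not measurements.
PPM = 1e-6

def expansion_um(length_mm, cte_ppm_per_c, delta_t_c):
    """Length change in micrometres: delta_L = alpha * L * delta_T."""
    return length_mm * 1000 * cte_ppm_per_c * PPM * delta_t_c

# 30 mm span, 50 degC warm-up: FR4 board (~14 ppm/degC) vs a
# ceramic chip package (~6 ppm/degC).
board = expansion_um(30, 14, 50)    # board side grows ~21 um
package = expansion_um(30, 6, 50)   # package side grows ~9 um
mismatch = board - package          # ~12 um taken up by solder joints
```

A dozen micrometres sounds tiny, but repeated over thousands of power cycles that mismatch is exactly the kind of repeated flexing that fatigues solder joints.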

 

I'm ignoring hard drive failure because that's not computer failure. My first computer didn't even have a hard drive.


This is a forum thread, not a formal presentation.

I'm sorry your humour circuits were not engaged this morning.

 

I expect you can find errors in the few posts I have made here. I would be happy to have any and all of them pointed out, even though none of them were in formal presentations. Clarity and a respect for the language make me demand it of myself. I'll gently chide oversights by others if they are in banner headlines in a title and the correction can be applied with some light humour.


You perhaps remember a time when MS-DOS diehards thought Windows 98 was a gimmick, or even Linux nerds who worked the same sort of command-line interface.

I still switch to text mode for programming or using software that requires a multitude of parameters (hundreds of menus and buttons can be a real pain in the arse when a simple command will do the job).


I still switch to text mode for programming or using software that requires a multitude of parameters (hundreds of menus and buttons can be a real pain in the arse when a simple command will do the job).

I suppose doing it the long way gives you an appreciation of how many steps modern stuff actually does in the blink of an eye now.


I suppose doing it the long way gives you an appreciation of how many steps modern stuff actually does in the blink of an eye now.

 

I do things in text mode when it's faster to do it that way or because software does not exist to do what I want.

It's very often quicker to write a script for a one-off job than it would be to search the internet for software that does something vaguely similar and then have to hack it anyway to do what I want.
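A hypothetical one-off job of that sort: normalising a pile of inconsistently named files to lowercase-with-underscores, which is quicker to script than to find a ready-made tool for. Shown here as a pure function (names and examples are made up) so it can be tested before being pointed at real files with os.rename:

```python
import re

def normalise(name):
    """Lowercase a filename and collapse spaces/hyphens in the stem
    to single underscores, keeping the extension."""
    stem, dot, ext = name.rpartition(".")
    stem = re.sub(r"[\s-]+", "_", stem.strip().lower())
    return f"{stem}{dot}{ext.lower()}"

# Dry run on hypothetical filenames before renaming anything:
renamed = [normalise(n) for n in ["Q3 Report-FINAL.PDF", "lab notes.TXT"]]
```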

