Sciencemadness Discussion Board

25 years later 25000 times more value for the money

franklyn - 23-9-2007 at 12:04

Best way to see how far we've come is to look back at where we've been.
PCWorld magazine has an article " The Most Collectible PCs of All Time "
Navigation is by the arrows top left or click the thumbs.
http://www.pcworld.com/article/id,136242/article.html?tk=nl_...

Depicted are 19 early endeavors of personal and small business computers.
I'm rounding this to 20 by nominating Clive Sinclair's little 12-ounce beauty, the
ZX-80
http://www.apj.co.uk/zx80/zx80_hardware.htm
http://en.wikipedia.org/wiki/Sinclair_ZX80 <> http://oldcomputers.net/zx80.html




It was not much more than a calculator that hooked up to your television.
The NEC Z80 CPU ran at 3.25 MHz. The operating system, BASIC interpreter,
character set and editor came on a 4 kilobyte ROM. It included 1 kilobyte of RAM,
expandable to 16 kilobytes for an extra 100 dollars. An 8 kilobyte ROM with
extended BASIC cost another 40 dollars. A cassette tape provided pre-programmed
applications for another 10 dollars. Yes, software!

Sinclair had a flair for miniaturization, producing the first portable black and
white television, which was the size of a paperback book and sold for 400
dollars in 1980. You could, if you wanted, have squinted and used the 2 inch
diagonal screen as a monitor, making in effect the first laptop. Then you had
to program Pong in BASIC :D




.

Xenoid - 23-9-2007 at 13:50

Hmmmm...!

I've got a couple of dozen 512K Macs, Mac Plus, Apple II, Atari and Amiga computers sitting in the basement. They only cost a few dollars each and all work!

I'm waiting for their value to go up, ....could be a long wait :(

Actually, my only "old" computer that I think has currently any value (at least here in NZ) is my Macintosh Colour Classic which I got for $5. I gather they are worth about $100.

Regards, Xenoid

The_Davster - 23-9-2007 at 14:25

I saved this from somewhere....

[Edited on 23-9-2007 by The_Davster]

pic0348.jpg - 81kB

Nerro - 23-9-2007 at 14:40

In ten years we'll all be laughing just as hard about the 1GB usb-stick that I bought for €5,00 yesterday :P

(holy shit! $199,33/MB!)

Incidentally I heard record players are coming back, apparently the sound is more "personal" from vinyl...

Eclectic - 23-9-2007 at 14:48

I have a dual drive double density 8" floppy drive with high speed voicecoil head positioning that holds a whopping 250k per disk! :D

Now how much would you pay?

Nerro - 23-9-2007 at 15:01

Nothing, we have two old computers in the attic that both have a dual 8" floppy drive. Great for playing Leisure Suit Larry :P

Xenoid - 23-9-2007 at 15:16

Quote:
Originally posted by Nerro
Incidentally I heard record players are coming back, apparently the sound is more "personal" from vinyl...


Damn!... I just finished digitising all my old LP's and sold them off! Good riddance, I say.
If "personal" means listening to pops, clicks, rumble and buzzes they can keep it!

I believe Australian and New Zealand produced vinyl was the worst in the world, I've seen LP's with hair pressed in them!

I've still got my record player, a Bang & Olufsen Beogram 4000 tangential tracker, it will still play records (not that I have any now) when tipped up at 45 degrees and a coin placed under the edge of the LP!

Regards, Xenoid

The_Davster - 23-9-2007 at 15:54

I have on occasion hooked up a 5 1/4 inch floppy drive to my laptop. USB to IDE connectors are fun, and let me show people how computer games used to be. Oregon Trail I, anyone?

EDIT: Yup, I was off by an inch, 5 and a quarter, not 4

[Edited on 23-9-2007 by The_Davster]

Darkblade48 - 23-9-2007 at 16:11

Quote:
Originally posted by The_Davster
I have on occasion hooked up a 4 1/4 inch floppy drive to my laptop. USB to IDE connecters are fun, and let me show people how computer games used to be. Oregon trail I anyone?

Just curious, aren't those 5 inch floppy drives?

I remember playing Oregon Trail as a child too! "You have died of syphilis" was always good for a laugh!

Remember Odell Lake and Number Crunchers? :D

Antwain - 24-9-2007 at 00:01

I think I may be just very slightly too young to remember that kind of stuff. Not long after I was ALLOWED to play on the computers at my house (i.e. to not break them by pulling them apart to see how they worked) I was playing stuff like Commander Keen!!!

We also have some old stuff my dad acquired somewhere. We have a portable HDD, which is kind of like a floppy except it's not floppy, has a diameter of over half a metre, is about 15cm thick in its dust-protecting case/cassette, and holds 3MB. We threw out the drive cos it was bigger than the dishwasher LOL.

I also have a working PC XT made of stuff I scavenged in '93 or so, cos I knew people would throw it out otherwise (they threw all the rest out). And do you know what the really tragic thing about it is? The 'poor' little 4.77MHz 640kB system loads up DOS 3.1 faster than my 2.8GHz loads up XP, by a factor of about 10. Damn you Microsoft :P

Darkblade48 - 24-9-2007 at 06:24

Interesting thing is my dad still has a few 10 inch floppies kicking around somewhere. He even has his punch card programs from wayyyyy back :D

12AX7 - 24-9-2007 at 07:53

"Nobody needs more than 640k." - Bill Gates

Antwain - 24-9-2007 at 10:43

These days, to run Windows, 640MB is pushing it :D Although I do believe that the 640k was an expensive upgrade from the 480k.

Ozone - 24-9-2007 at 19:15

I thought Oregon Trail was "You have died of Dysentery" with the damn ox cart, in blazing green on the screen:D.

Zork? Sherwood Forest? Bedlam? Whoo! BRUN Bob to the rescue, fire up the Nibbler!

Potty pigeon was especially memorable for my Commodore 64, then 128-D:D. Wasteland was the best title, period.

I needed to edit autoexec.bat to up the memory to the XM limit of 1024 to run the more advanced SSI games on my 386 DX 33 with the 60MB HD.

@Davster, hey that was a lot of juice back then! IIRC Lotus 1-2-3 was the biggest thing out there and everyone was complaining about the whopping 1MB it needed!

Hold on, I have got to go take some pictures!

Here (I hope):

http://www.uncommonalchemy.com/Memorabilia.htm

Cheers,

O3

franklyn - 10-2-2010 at 16:02

Who says one can't do world class computation on a personal computer

Personal computers are becoming silly powerful.
Most people still use them for mundane applications
such as word processing , a task for which machines
were already overpowered 15 years ago.


Pi to 2.7 trillion.gif - 27kB

IrC - 10-2-2010 at 19:21

The Timex Sinclair was one of the coolest computers. I had a friend who built a mainframe for a parallel experiment and then got married; no time for toys, so he gave it to me. I used the mainboards from a dozen of them to complete it, finished around 1984 IIRC. I had one rack which held 100 RAM boards, which gave me I think around 100K of memory (no pun, but my memory of it is fading). Zane would help from time to time when his wife let him come out to play. Another guy I only knew as Zero Page (refused to ever give up his real name) wrote programs for me to make it all work, as I was trying to build a game machine. Bought a C64 the same year and Fort Apocalypse to play on it and realized I had a useless monster, but I did learn a lot building my first parallel processing computer.

That game was so much fun that nothing from then until now competes with it. You had to use a "Slick Stick" joystick to play the game due to the speed and precision required to make it through alive. In the 90's I started toying with CCS64 on my IBM, hoping to play the game again. No way: there was about 1/16 inch in 4 directions to move the joystick, and the variable resistor method used in all my modern joysticks is so sloppy it's impossible. I bring this all up for this reason. Does anyone have a circuit design to make such a tight, precise joystick for my 3 GHz machine, as I still have my C64 emulators and games to run on it? Also have a bunch of games like Duke Nukem Plutonium which will not run, as the game will not see the IRQs I can select with any of my newer sound cards, and these cards refuse to let me set them to numbers lower than 10.


hinz - 10-2-2010 at 19:39

That's sick.
Respect to the guy who wrote the program, as he must have coded most of it in assembly / used some compiler hacks, because I don't think any normal compiler can handle numbers that large.

Depending on how fast the Chudnovsky algorithm converges to 1/pi, the factorial terms (6k)! in the Chudnovsky algorithm must have been enormous.
Also, converting the 1137GB 1/pi number back to pi (presumably by calculating 1/(1/pi) ) is pretty hardcore. Probably he used the pen-and-paper division algorithm over the whole 1137 GB by repeated loading of chunks of the enormous number, as it's even far too big to fit into the RAM.

So probably all the fancy x86 floating point SSE extensions and the 64 bit CPU didn't help much, only the hard disk size counted.
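
For anyone curious, here is a minimal sketch of the idea (my own toy example in Python, not the record program): Python's built-in arbitrary-precision integers play the role of the bignums, and the series is summed with binary splitting so that only integer multiplications and one final division are needed. Record runs add disk-based FFT arithmetic on top of this; math.isqrt needs Python 3.8 or newer.

import math

def chudnovsky_pi(digits):
    # pi to 'digits' decimal places via binary splitting of the Chudnovsky series
    C3_OVER_24 = 640320**3 // 24

    def bs(a, b):
        # split terms a..b-1 into partial products P, Q, T;
        # for the full range, pi = 426880 * sqrt(10005) * Q / T
        if b - a == 1:
            if a == 0:
                Pab = Qab = 1
            else:
                Pab = (6*a - 5) * (2*a - 1) * (6*a - 1)
                Qab = a * a * a * C3_OVER_24
            Tab = Pab * (13591409 + 545140134 * a)
            if a & 1:
                Tab = -Tab
            return Pab, Qab, Tab
        m = (a + b) // 2
        Pam, Qam, Tam = bs(a, m)
        Pmb, Qmb, Tmb = bs(m, b)
        return Pam * Pmb, Qam * Qmb, Qmb * Tam + Pam * Tmb

    n_terms = digits // 14 + 2                    # each term adds ~14.18 digits
    P, Q, T = bs(0, n_terms)
    one = 10 ** digits
    sqrt_10005 = math.isqrt(10005 * one * one)    # sqrt(10005) scaled by 10^digits
    pi_scaled = (426880 * sqrt_10005 * Q) // T    # pi scaled by 10^digits
    s = str(pi_scaled)
    return s[0] + "." + s[1:]

print(chudnovsky_pi(100))    # 3.14159265358979323846...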

[Edited on 11-2-2010 by hinz]

barbs09 - 10-2-2010 at 23:05

About 20 years ago Dad rescued a Datapoint (I think) mainframe from our accountant. Apparently it was outdated and was due for the tip. The unit was over 1M NZ dollars new. At the time we reasoned that it might have a few good parts in it... It consists of three units, each about 1.5m x 1m x 1m.

We use them as table legs :)

12AX7 - 11-2-2010 at 08:24

Quote: Originally posted by IrC  
Also have a bunch of games like Duke Nukem Plutonium which will not run, as the game will not see the IRQs I can select with any of my newer sound cards, and these cards refuse to let me set them to numbers lower than 10.


DOSBox, or better yet, http://www.jonof.id.au/jfduke3d

Funny, I just finished playing Shadow Warrior in OpenGL (same site). A few bugs, but works for the most part.

Tim

12AX7 - 11-2-2010 at 08:33

Quote: Originally posted by hinz  
Probably he used the pen-and-paper division algorithm over the whole 1137 GB by repeated loading of chunks of the enormous number, as it's even far too big to fit into the RAM.


Probably calculated with bignums.
http://www.google.com/search?q=bignum+division

Quote:
So probably all the fancy x86 floating point SSE extensions and the 64 bit CPU didn't help much, only the hard disk size counted.


CPUs run so fast these days, they're essentially instant computation surrounded by bottlenecked bottlenecks. The secret to optimization today has almost nothing to do with assembly code, for two reasons: one being sheer speed, the other being that good compilers are only slightly slower than perfect assembly. The real trick is using datasets on the order of cache size, so the (hardware) memory manager can copy more from system RAM to cache, and cache to cache, while the processor is chugging away, waiting a minimal amount of time before more data becomes available.
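
One quick way to see that memory wall in action, assuming Python with numpy (the array size and stride here are arbitrary, and the exact numbers will differ per machine): summing every 16th element does 1/16 of the arithmetic, yet takes nowhere near 1/16 of the time, because the cost is dominated by hauling cache lines in from RAM rather than by the additions.

import time
import numpy as np

x = np.random.rand(64 * 1024 * 1024)      # ~512 MB of float64, far larger than any cache

def timed_sum(stride):
    # sum every 'stride'-th element and report how long it took
    t0 = time.perf_counter()
    x[::stride].sum()
    return time.perf_counter() - t0

print("stride  1 (all elements)     : %.3f s" % timed_sum(1))
print("stride 16 (1/16 of elements) : %.3f s" % timed_sum(16))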

Tim

psychokinetic - 11-2-2010 at 11:56

I wonder if I could plug my PS2 into that micro television.

All I'd need for portability is a...well.......diesel generator in my backpack....

Still, I've seen some calculators that were less portable than what I've just described :P

woelen - 11-2-2010 at 12:07

Quote: Originally posted by 12AX7  
CPUs run so fast these days, they're essentially instant computation surrounded by bottlenecked bottlenecks. The secret to optimization today has almost nothing to do with assembly code, for two reasons: one being sheer speed, the other being that good compilers are only slightly slower than perfect assembly. The real trick is using datasets on the order of cache size, so the (hardware) memory manager can copy more from system RAM to cache, and cache to cache, while the processor is chugging away, waiting a minimal amount of time before more data becomes available.

Tim

I do not agree with this. In my daily practice as a technical software engineer and software architect I frequently run into problems with only moderate datasets, but still requiring long computation times. A nice commercial example of algorithms using lots of computation power are modern cryptographic methods based on elliptic curves. These methods are augmented with plain symmetric encryption schemes like Rijndael for the simple reason that even modern CPU's do not offer enough computing power to encrypt all the communication purely with public-key cryptography.

Another example I have run into is in the robotics area, where a high order state model of a controlled mechanical system is simulated in real time in order to have a model running alongside the real system, where the model can be used to improve control of the real system. More and more is possible with modern CPU's, but still there are many applications where more CPU power really adds to the capabilities of the system.

I agree with you when it comes to domestic applications like word processing, picture editing or even movie watching. In those applications the speed of the hard disk, memory and network are more and more the limiting factors, but your statement cannot be generalized to all applications.

I think there will also certainly be chemists over here who want faster CPU's for their computational chemistry needs. When it comes to simulating quantum systems, our modern CPU's are still in their childhood. That kind of computation requires HUGE processing power.

12AX7 - 11-2-2010 at 13:19

Quote: Originally posted by woelen  

I do not agree with this. In my daily practice as a technical software engineer and software architect I frequently run into problems with only moderate datasets, but still requiring long computation times.


That may well be true. Computation-intensive functions might run in the fastest level of cache, and there are an awful lot of long-duration manipulations you can perform on 64 thousand bits. On the other hand, data-intensive functions hammer the cache. Right now, folding@home is using about 20MB of RAM, which I dare to presume means it's actively working with a dataset about that size. How it handles it I have no idea. I can compare that to the cache sizes, which apparently are 128k L1 and 512k L2.

Tim

anotheronebitesthedust - 11-2-2010 at 15:49

Anyone remember a game called Rogue?

Panache - 16-2-2010 at 03:18

Quote: Originally posted by woelen  

Another example I have run into is in the robotics area, where a high order state model of a controlled mechanical system is simulated in real time in order to have a model running alongside the real system, where the model can be used to improve control of the real system. More and more is possible with modern CPU's, but still there are many applications where more CPU power really adds to the capabilities of the system.


Is that how they got that creepy headless reindeer to perform so well?
http://www.youtube.com/watch?v=W1czBcnX1Ww

I have pondered that since I saw it a couple of years back. The walking mechanism can manageably be programmed, no doubt (most likely from actual bluescreen footage of a deer with dots on it), and modern CPUs can run those algorithms quickly enough to be effective, but the speed at which the feedback loops for this thing cut in is amazing; see it recover when the operator kicks it.
I hate knowing so little about a technology that I cannot even broadly describe something pertaining to it.

woelen - 16-2-2010 at 04:25

This is one of the methods to achieve finer control over robotic systems. The mechanism is that a real robotic system has a number of sensors, which measure some states of the system. By means of a mathematical model, then other states can be derived. One method is to simulate a dynamic model in real time, which provides access to all relevant system states. The states in such a model of course would drift away over time from the real system states. Measurements in the real system then are used to keep the states of the dynamic model near the real states. The nice thing is that also unmeasured states can be kept near the real states and this extra information then can be used to achieve finer/better control.

A whole branch of mathematics has been developed and its main goal is to determine observability and controllability of states which are not directly measured or controlled. For linear systems the math is not that difficult, but for non-linear systems, such as that headless reindeer, the underlying math can be extremely complicated and quite powerful systems are needed to do the realtime computations.

The way it was then

franklyn - 15-6-2010 at 05:06

The Internet as imagined in 1969
http://www.youtube.com/watch?v=Y0pPfyYtiBc

20 years later an early realization
http://www.youtube.com/watch?v=yFF0oQySsh4

.

densest - 15-6-2010 at 07:21

@woelen - the modeling you mention sounds very interesting - are there any key books or papers or people to google for to get an idea where that field is going?


watson.fawkes - 15-6-2010 at 07:51

Quote: Originally posted by densest  
are there any key books or papers or people to google for to get an idea where that field is going?
The terms observability and controllability are concepts from control theory. Start with the "control theory" page on Wikipedia. One of the references at the bottom of that page is this paper: Modern Control Theory - A historical perspective. I've skimmed it and it looks like a solid introduction. I find historical survey papers such as this one are enormously useful, as they help me categorize what I read into the various schools of thought and thus to orient myself within an unfamiliar field of research more rapidly.

woelen - 16-6-2010 at 01:49

Another interesting read may be about the so-called Kalman filter:

http://en.wikipedia.org/wiki/Kalman_filter

It uses a dynamic model of a system to keep (noisy and inaccurate) measurements and a simulated model close to each other. This has the advantage that noise is removed (the model has no noise, only the real measurements have noise). It also has the advantage of being able to use additional states from the model, which are not measured at all. Of course, one must be careful with the latter application and be sure that the underlying dynamic model is a sufficiently good approximation of the real system for the purpose you are using it.

Be prepared though that this is not easy math! You need university level calculus in order to understand the theory behind the Kalman filter, non-linear dynamic systems models and the statistics incorporated in the method.
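
To give a feeling for the mechanism without the heavy math, here is a deliberately tiny one-dimensional sketch in Python (my own toy example, not taken from those pages): we track a slowly drifting quantity x from noisy measurements z, and the filter blends the model prediction with each measurement using a gain K that follows from the assumed process noise Q and measurement noise R.

import random

Q = 1e-4          # assumed process noise variance (how much x may drift per step)
R = 0.5 ** 2      # assumed measurement noise variance

x_est, P = 0.0, 1.0      # initial estimate and its variance
x_true = 2.0             # the "real" value, unknown to the filter

for k in range(50):
    # simulate the real system and a noisy sensor
    x_true += random.gauss(0.0, Q ** 0.5)
    z = x_true + random.gauss(0.0, R ** 0.5)

    # predict: the model says x stays put, but its uncertainty grows
    P += Q

    # update: blend prediction and measurement
    K = P / (P + R)              # Kalman gain
    x_est += K * (z - x_est)
    P *= (1.0 - K)

print("true value ~ %.3f   estimate ~ %.3f" % (x_true, x_est))

In a real application the state is a vector, the scalars become matrices, and for non-linear systems the model has to be linearized around the current estimate (the extended Kalman filter), which is where the heavy math comes in.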


The concepts of controllability and observability are explained here and worked out for linear systems:

http://en.wikibooks.org/wiki/Control_Systems/Controllability...

This theory is not that difficult. Some basic linear algebra, however, is needed for understanding this.
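
As a very small concrete example of that linear-algebra test (a sketch assuming Python with numpy): a linear system x' = Ax + Bu is controllable exactly when the controllability matrix [B, AB, A^2 B, ..., A^(n-1) B] has full rank n.

import numpy as np

def is_controllable(A, B):
    # build the controllability matrix [B, AB, A^2 B, ...] and check its rank
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    return np.linalg.matrix_rank(np.hstack(blocks)) == n

# double integrator: state = (position, velocity), single force input
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
print(is_controllable(A, B))     # True: position can be steered via the force alone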

For non-linear systems the situation may be much more complex. Many mechanical systems in robotics are so-called 'non-holonomic' systems, which means that the number of meaningful controllable freedoms is less than the total number of degrees of freedom. A nice example is a car. In 2D-space (e.g. a floor), it has three degrees of freedom (position in x and y-direction and orientation). There are, however, only two meaningful controls, being force in the forward/back direction and steering angle. A car cannot move sideways directly, but it can be moved sideways indirectly, using the other two controls (e.g. driving forward for 10 meters and then driving backwards while steering somewhat, the classical car parking problem).

http://en.wikipedia.org/wiki/Nonholonomic_system


A very simple model demonstrating non-holonomy is this model of a robotic cart with position (x,y) and orientation phi, which can move forward/backward with velocity v in the direction of its orientation and which can rotate around its axis with angular velocity w:

Three degrees of freedom:
dx/dt = v*cos(phi)
dy/dt = v*sin(phi)
d(phi)/dt = w

Only two meaningful controls:
dv/dt = k1*F - R1*v
dw/dt = k2*T - R2*w

You apply a force F in the forward or backward direction, which leads to an acceleration dv/dt, or you apply a torque T, which leads to a rate of change of angular velocity w (angular acceleration dw/dt). Here, k1 and k2 simply are constants, depending on the mass, rotational inertia and construction of the cart (e.g. k1=1/M and k2=1/J, where M is mass and J is inertia). R1 and R2 are friction-modelling damping constants, which assure that if no force or torque is applied, the cart will come to a standstill.

Just play around with this model in order to get a feeling of the concept of non-holonomy. Use k1 = k2 = R1 = R2 = 1 for simplicity. You'll see that although you just have two controls, any state (x, y, phi) can be reached, starting from any other initial state. The practical path from one state to the other may be hard to find though.
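
If you want to try this numerically, a minimal Euler-integration sketch in Python could look like the following (the inputs F and T below are just hand-picked examples; swap in your own to see how the cart responds):

import math

k1 = k2 = R1 = R2 = 1.0
dt = 0.01
x = y = phi = v = w = 0.0        # start at the origin, at rest

def F(t):                        # drive forward for the first 5 seconds, then coast
    return 1.0 if t < 5.0 else 0.0

def T(t):                        # steer gently between t = 2 s and t = 4 s
    return 0.5 if 2.0 < t < 4.0 else 0.0

t = 0.0
while t < 10.0:
    # three degrees of freedom
    x   += dt * v * math.cos(phi)
    y   += dt * v * math.sin(phi)
    phi += dt * w
    # two controls
    v   += dt * (k1 * F(t) - R1 * v)
    w   += dt * (k2 * T(t) - R2 * w)
    t   += dt

print("final state: x = %.2f  y = %.2f  phi = %.2f rad" % (x, y, phi))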

In robotics this type of phenomenon is very common. It can be present in the internal workings or in its motional freedom.




[Edited on 16-6-10 by woelen]

franklyn - 14-11-2011 at 10:41

The first and second highest performing supercomputers are now in the far east.
Yes but can you play Pong on them?

1 - Japan
The K computer, built by Fujitsu, has 68,544 central processing units, each with
eight cores, housed in over 800 racks. It is capable of performing more than 8
quadrillion calculations per second.

2 - China
The Tianhe-1A uses just 14,336 central processing units because, unlike the K
computer, it has 7,168 NVIDIA graphics processors to accelerate computation,
and it is capable of achieving speeds of 2.57 quadrillion calculations per second.

who says the PC is dead :)

.

IrC - 14-11-2011 at 11:29

"I'm rounding this to 20 by nominating Clive Sinclair's little 12 once beauty the ZX-80"

I remember the great day I had enough to get my shiny new 8K of RAM. Money was tight and damn hard to come by during the Carter years. I built a rack in 1984 using 10 main boards from the Timex Sinclair ZX-80 for my first parallel processing experiment. You are right, the article is very lacking in omitting this machine, amazing for its time. Nor do they include the TI-99/4A, the one I wrote my first video game in BASIC on in 1981; another one which should have been on the list, making it 21.

This happens when people talk about the past having never lived it.


AndersHoveland - 14-11-2011 at 16:34

Perhaps someone should have started a thread about the cost of housing: "25 years later and 3 times less value for the money". The only improvements there have been in the last 25 years are technological innovations. Other than that, everything else in society has just gotten worse. The story of humanity I suppose.

dann2 - 14-11-2011 at 18:13


Is it true that the estimated computing power of a leech is trillions of times more than that of the best supercomputer?
If so, it's a pity it cannot be harnessed.

AndersHoveland - 14-11-2011 at 18:35

Quote: Originally posted by dann2  

the estimated computing power of a leech is trillions of times more than that of the best supercomputer?
If so, it's a pity it cannot be harnessed.


what exactly are you proposing?

Perhaps in the future all the newest computers will have aquariums built inside! :P

Here is what a computer "mouse" might look like:


Polverone - 14-11-2011 at 19:32

Quote: Originally posted by dann2  

Is it true that the estimated computing power of a leech is trillions of times more than that of the best supercomputer?
If so, it's a pity it cannot be harnessed.


I think this is a confused interpretation, like someone estimating how much computing power it would take to model a leech and then mistakenly referring to this estimate as the computing power "of" a leech. Likewise, it would take vast amounts of computing power to model all the chemical changes taking place in a burning cigarette, but it would be mistaken to refer to the vast computing power of a cigarette. Computing and things-that-can-be-modeled-by-computing are not generally commensurable.

dann2 - 14-11-2011 at 19:41


I thought it meant the 'computing power' of its brain.
It would take many, many supercomputers (if all the sensors and actuators could be built) to do what the leech can do: sense its surroundings, move, mate :cool: etc etc.
I am not standing by this. I have not done the calculations necessary to prove it!

It is rather difficult to compare and measure a leech's brain (does it have one?) to a computer.


[Edited on 15-11-2011 by dann2]

White Yeti - 28-12-2011 at 08:30

Quote: Originally posted by dann2  

It is rather difficult to compare and measure a leech's brain (does it have one?) to a computer.


It drives me absolutely insane when people compare a brain to a computer. They are not one and the same. A digital computer is flawless, can perform mathematical calculations instantly, and can run programs. A brain is flexible, adaptable and imperfect, and thus unable to compute without using a roundabout method. You cannot compare the two.

There is one danger associated with the way we store data and information nowadays with computers. We live in an age of immense progress in computer technology, but we keep changing the way we store data. Take a cassette tape, or a phonograph cylinder for example. Those technologies are antiquated and pretty soon we will no longer have the equipment needed to view data stored on those storage media. As we keep changing the way we store data, we no longer have records of the past because much of the data is lost.

There is one technology that is reliable no matter what time period you live in, that's the book. The book does not need electricity, it does not need any external mechanism to retrieve data (other than your eyes and your brain) so it can be passed down from generation to generation. Even though we have computers and other gadgets, the book remains the most reliable record of the past.

AndersHoveland - 28-12-2011 at 11:32

Unfortunately, books are made of paper. Books only last up to 400 years under normal conditions. No one is going to take the trouble of storing books in the absence of oxygen and moisture for posterity. Ironically, the clay tablets from ancient civilizations will far outlast modern knowledge which is stored on paper or disk.

[Edited on 28-12-2011 by AndersHoveland]

White Yeti - 28-12-2011 at 18:01

Quote: Originally posted by AndersHoveland  
Unfortunately, books are made of paper. Books only last up to 400 years under normal conditions. No one is going to take the trouble of storing books in the absence of oxygen and moisture for posterity. Ironically, the clay tablets from ancient civilizations will far outlast modern knowledge which is stored on paper or disk.


I wouldn't be so sure; books can last much more than 400 years. It depends on the climate in which they are stored. If they are stored in the arid climates of the Middle East, they could easily outlast civilisations. If they were stored in my basement, they wouldn't last more than 20 years. Unfortunately, books are destroyed by fire. Fires in libraries are a significant reason why much information has been lost over the centuries.

[Edited on 12-29-2011 by White Yeti]

Dr.Bob - 30-12-2011 at 11:05

I have some old core memory boards that might be readable... That ties together the part of the thread about old computers and the archival storage issues....

Does anybody have an old PDP-11 that needs a memory upgrade? I think each board holds a whopping 1K! They are somewhere in my basement under a pile of lab supplies. I also have a power supply for one. That thing was so big that it will hold 12V for several minutes of (a small) load after being unplugged. Who needs a UPS when you have farads of capacitance?


Spart - 30-12-2011 at 11:40

The reason why those old computers look so overpriced after such a small amount of time is Moore's Law, which states that computing power doubles approximately every 2 years. This includes things such as processing speed, memory capacity, and the pixel count and image size of digital cameras. It is quite fascinating to think that after only 2 years, humanity's computing hardware capability has doubled.

White Yeti - 30-12-2011 at 15:52

Quote: Originally posted by Spart  
The reason why those old computers look so overpriced after such a small amount of time is Moore's Law, which states that computing power doubles approximately every 2 years. This includes things such as processing speed, memory capacity, and the pixel count and image size of digital cameras. It is quite fascinating to think that after only 2 years, humanity's computing hardware capability has doubled.


That only applies to silicon based processing units, but once we exhaust silicon's potential, the law will no longer apply. Exponential growth must stop at one point or another.

woelen - 31-12-2011 at 04:47

I think that we are already at the end of silicon's potential. Look at the clock frequencies of processors. They only grow marginally nowadays, due to heat issues. We still see progress, but now it's in the number of cores, lower power usage and more efficiency in processing CPU instructions (fewer cycles per instruction). But this increase in processing power will also soon find its end.

I do not, however, expect that with the end of the potential of silicon chips there will be an end of the increase of processing power. There may be a (short) period of stagnation, but soon other technologies will take over. Promising technologies are light-based computing, use of polymers for computing devices, use of nano-technology and in the somewhat further future even bioelectronic devices may appear.

White Yeti - 31-12-2011 at 08:29

Quote: Originally posted by woelen  
I think that we are already at the end of silicon's potential. Look at the clock frequencies of processors. They only grow marginally nowadays, due to heat issues. We still see progress, but now it's in the number of cores, lower power usage and more efficiency in processing CPU instructions (fewer cycles per instruction). But this increase in processing power will also soon find its end.


Silicon's potential is close to being exhausted, but technological developments are still enabling us to make CPU's ever faster and more powerful. The transistors nowadays are as small as they're going to get, but now IBM is building 3D CPU sammiches:
http://arstechnica.com/hardware/news/2008/06/ibm-demonstrate...

I'm waiting for the development of graphene computers, where transistors can be scaled down to just a few atoms wide. Plus, the added bonus is that graphene is an amazing conductor of heat, making cooling of 3D chips all that much simpler.

Quantum Computing

Diablo - 13-2-2012 at 11:40

I can't wait to see a quantum computer, where a single atom can hold a value of 0, 1, or everything in between.

GreenD - 13-2-2012 at 13:52

Quantum computing brings up very interesting philosophical perspectives.

According to the field, there is no possible way to predict any system with absolute precision or accuracy without the predicting device (quantum computer) being the exact system.

It seems straightforward; the only way to perfectly predict the weather is to "be" the weather. However, this also shows that predicting the future accurately is impossible (of course, the closer to the present you predict, the more accurate you are). This has a direct consequence for determinism...

White Yeti - 13-2-2012 at 15:34

I have a prehistoric kitchen scale the Flintstones might have used to make cookies; should I post pictures? It goes up by increments of 10g but I still use it, because quite frankly, I don't have anything better at the moment.

Pyro - 28-4-2012 at 15:21

lol,
15 MB for 4000 dollars; now you buy a tiny little stick with 16GB for 30 bucks! And a dollar was worth a lot more then.

franklyn - 11-1-2013 at 22:47

http://www.sciencemadness.org/talk/viewthread.php?tid=17362&...

White Yeti - 12-1-2013 at 08:45

An explanation would have been more useful than a broken link IMO.

franklyn - 14-5-2013 at 11:58

www.extremetech.com/extreme/155636-the-bitcoin-network-outpe...

.

phlogiston - 14-5-2013 at 15:15

Quote:
I also have a working PC XT made of stuff I scavenged in '93 or so, cos I knew people would throw it out otherwise (they threw all the rest out). And do you know what the really tragic thing about it is? The 'poor' little 4.77MHz 640kB system loads up DOS 3.1 faster than my 2.8GHz loads up XP, by a factor of about 10. Damn you Microsoft



An example of a trend that has been observed since 1987 and is known as Wirth's law:

"software is getting slower more rapidly than hardware becomes faster."

woelen - 14-5-2013 at 22:52

This is not a good comparison. A well-tuned system boots faster than a system which has become loaded with tons of software and services. Windows XP unfortunately tends to become slower and slower over time. Newer Windows versions and also most Linux distributions boot much faster.

Another thing to keep in mind is functionality. Systems nowadays can do things which were unimaginable 20 years ago. Drivers and services, needed for being able to use this functionality also have to be started or loaded.

12AX7 - 15-5-2013 at 18:09

I keep my XP laptop booted and use standby... bypasses the one problem.

I suppose that begs the question, how long until standby itself is bloated by excessive BIOS checks and functionality? As it stands, it's far from instantaneous: usually a few seconds. In principle, it should be doable within around 100ms, most of which is spent waiting for power supplies to stabilize (which have time constants in the 1-10ms range).

Can't say I've seen a computer boot in less than several minutes on conventional hardware and software (even with 'optimized' installs). Apparently SSDs do wonders, though.

Tim

woelen - 16-5-2013 at 01:56

I have a Shuttle XH61V (a very small barebone PC) with Core i3-3225 and a Samsung 840 PRO 256 GByte SSD and 8 GByte of DDR-1600 RAM. This is not exceptional hardware, actually it is fairly low end, except the SSD. I installed Ubuntu 12.10 on this and it boots in only 7 seconds (time from switch on until I can work on the system, i.e. the desktop is up-and-running and ready for use).

The SSD is very fast (approx. 100,000 IOPS for reads and 90,000 IOPS for writes, transfer rate 530 MByte/s). This makes a huge difference compared with normal hard disks.

franklyn - 28-5-2013 at 22:01

www.wired.co.uk/news/archive/2013-05/17/quantum-computer

franklyn - 3-7-2013 at 20:08

www.extremetech.com/computing/160367-new-programming-languag...

Eliteforum - 4-7-2013 at 03:18

Meh, computers haven't got to the point where I'd like them to be yet.

http://i.imgur.com/atU1AcO.jpg

Waiting on an order from Butterfly Labs..

[bfesser: reduced image size(s)]

[Edited on 7/16/13 by bfesser]

A quiet revolution

franklyn - 9-7-2014 at 01:10

It was noted in May of last year that bitcoin mining exceeded the combined computational power of the world's top 500 supercomputers by 8 times.
www.extremetech.com/extreme/155636-the-bitcoin-network-outpe...
By 6 months ago bitcoin mining was 256 times greater.
www.forbes.com/sites/reuvencohen/2013/11/28/global-bitcoin-c...

https://coinreport.net/mining-bitcoin-survival-fastest

Bitcoin mining rigs ( ASIC , Application Specific Integrated Circuit , devices )
www.spondoolies-tech.com/products/sp30-yukon-september-batch... ( 6 tera hash / sec )
www.best-miner.com ( 1 tera hash / sec )

www.bitcoincharts.com/charts/bitstampUSD#rg730ztgOzm1g10zm2g...
www.sciencemadness.org/talk/viewthread.php?tid=14592&pag...
www.sciencemadness.org/talk/viewthread.php?tid=17281&pag...

In the near term, and likely for the foreseeable future until the exhaustive ending of the mining protocol, the purchase of dedicated machines for several thousand dollars, which consume the power of a large air conditioner and can do nothing else, will contribute only additively, as the difficulty of mining grows faster than the technological ability to scale ever greater computation. With demand exceeding creation, bitcoin can only increase in value.
www.youtube.com/watch?v=5CjldZLXiAU


.

Praxichys - 9-7-2014 at 05:57

Quote: Originally posted by woelen  
I have a Shuttle XH61V (a very small barebone PC) with Core i3-3225 and a Samsung 840 PRO 256 GByte SSD and 8 GByte of DDR-1600 RAM. This is not exceptional hardware, actually it is fairly low end, except the SSD.

I wouldn't call that low-end. My parents still run Windows ME.

My personal rig is pretty good-

Core i5 (haswell) @ 3.40 GHz
16GB DDR3 2400 RAM
256 GB Samsung EVO SSD
2TB SATA 3 HDD
EVGA GTX 760 w/ 4GB VRAM
64 bit Win7 Pro

This thing absolutely smokes every game I own. It can run a minecraft server where I have told 5 or 6 people to deliberately attempt crashing it by placing thousands of blocks of TNT and blowing them up... all their clients crash, and my client AND server console stay running on the same computer. I do have a game (Planetary Annihilation) that can hover at 9GB RAM at times, but it is in beta and probably has yet to be optimized.

And yes, I remember upgrading my first PC to a 5.25 GB HDD back in 1998. Now my phone has a removable 32GB microSD flash chip the size of my baby fingernail, which cost about $35.

They're here, Bitcoin ATM's

franklyn - 13-7-2014 at 21:28

http://ktla.com/2014/07/07/hands-on-with-one-of-the-first-bi...
http://thetechreport.tv/2014/07/07/bringing-digital-currency...
www.digitaltransactions.net/news/story/How-a-Digitally-Minde...

.

franklyn - 2-7-2015 at 13:12

Some years ago I bought a 2 GB USB thumbdrive for $85.

This now provides 100 times the space for just 3 times the price.

http://thehackernews.com/2015/06/200gb-microsd-card.html

.

gregxy - 6-7-2015 at 10:15

All "Technology" is driven by Moore's law, which has been advancing like
clockwork for 50 years!

"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year. Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years."
G. Moore, 1965

Production is now starting on designs with 14nm feature sizes. 7nm devices are under development. These are still being printed with UV light with a wavelength of 193nm. Over the past few years there has been a shift from planar MOS devices to fin shaped MOS devices which offer higher transconductance with lower leakage. This has allowed Moore's law to continue but greatly complicated the design and fabrication process.

http://www.techdesignforums.com/practice/technique/advanced-...

Oscilllator - 26-7-2015 at 23:33

Quote: Originally posted by gregxy  
All "Technology" is driven by Moore's law, which has been advancing like clockwork for 50 years!

I'm not so sure about this. In 2011 the Samsung Galaxy S2 came out with a 1.2GHz processor. So according to Moore's law (a doubling every 2 years), a top-of-the-line phone in 2015 should be 2^2 = 4 times faster, or ~4.8GHz. Now obviously this isn't even remotely close to being the case - the Galaxy S6 Edge clocks in at 2.1GHz, not even double the S2.

I understand that in the smartphone market power usage is also a big factor, but still. It seems like Moore's law stopped quite a while ago.

[Edited on 27-7-2015 by Oscilllator]

Fulmen - 27-7-2015 at 00:11

There is much more to CPU performance than clock speed. The number of cores, for instance, and as you said there is the question of power consumption. Cutting the power consumption in half counts as a doubling of performance.

One should also understand that Moore's Law is empirical in nature, and to a great extent a self-fulfilling prophecy. It has turned into an industry-wide goal as everybody assumes that their competitors will achieve similar growth.

[Edited on 27-7-15 by Fulmen]

careysub - 27-7-2015 at 09:10

Quote: Originally posted by Oscilllator  
Quote: Originally posted by gregxy  
All "Technology" is driven by Moore's law, which has been advancing like clockwork for 50 years!

I'm not so sure about this. In 2011 the Samsung Galaxy S2 came out with a 1.2GHz processor. So according to Moore's law (a doubling every 2 years), a top-of-the-line phone in 2015 should be 2^2 = 4 times faster, or ~4.8GHz. Now obviously this isn't even remotely close to being the case - the Galaxy S6 Edge clocks in at 2.1GHz, not even double the S2....


gregxy helpfully provides Moore's actual law, as formulated originally by Moore himself, not the many paraphrases or corollaries derived from it. Moore does not mention clock speeds at all, only component costs. And that 'law' stays on track, more or less (it has slowed down a bit since the mid-1960s, but the trend has remained steady).

As Fulmen points out, there are more ways of increasing processing power than the misleading one-dimensional "clock speed" (some may remember the PowerPC vs Wintel marketing war in the 1990s, when PowerPC chips that did more work per cycle were at a disadvantage to the nominally faster but actually slower Intel chips).

Since raw clock speeds have flattened, pipelining and hyperthreading have shoved more processing into each clock cycle, and of course we also get multiple cores on a single chip.

And as Fulmen also pointed out, the amount of computation per watt becomes increasingly important. We can only get "brain equivalent" computers by drastically lowering the power consumption per operation.

It is helpful to consider the operating characteristics of what remains the most powerful processing system we have: the human brain. Its "clock speed" is extremely slow, well below 1 kHz, but it is massively parallel and extremely power efficient.

Our artificial computation technologies have an immense speed advantage, but have to evolve in the same direction before they can compete with natural neural networks.

Corner the market in bitcoin

franklyn - 17-11-2015 at 19:12

www.pcworld.com/article/3005414/computers/intel-plugs-72-cor...

IrC - 17-11-2015 at 19:34

2 years and I still don't know, Eliteforum. Is this what one uses for bitcoin mining?

atU1AcO.jpg - 383kB

Over the years I have used a pick and shovel to mine gold, silver, uranium, even diamonds. However, none of it was near as much fun as it looks like you are having in this room.

They're here

franklyn - 9-12-2015 at 08:28

http://gizmodo.com/google-claims-to-have-proved-that-its-qua...

www.computerworld.com/article/3013102/high-performance-compu...

http://arxiv.org/abs/1512.02206