Ramen Open Source Compositor 64 Bit For Mac

The original Amigas, up to at least the A2500 and/or A3000, didn't have an MMU, and I don't think the A4000/030 had one either (the A1200 most certainly did not). Memory protection without an MMU would have been impossibly hard. ETA: I don't recall if the A3000 used an EC030 or the full version.

The Amiga 2500s, the Amiga 3000, and the regular Amiga 4000 had MMUs. The Amiga 4000/030 apparently did not. Commodore's software and tech support folks had gone a very long way toward solving the problem of memory protection with development tools like Enforcer. You got an MMU-protected development environment that flagged memory errors, leaks, etc.

A free and open-source graphics device driver is a software stack which controls computer-graphics hardware and supports graphics-rendering application programming interfaces (APIs), and is released under a free and open-source software license. Graphics device drivers are written for specific hardware to work within a specific operating system kernel and to support a range of APIs. Ramen is a node-based image and video compositor. The Linux version should be easy to compile and run on any Unix-like operating system with a few minor changes.

Coupled with this, the CATS folks ensured that all magazines got copies of Enforcer. So by the later 1980s, it was a huge problem for a company to release a program that displayed 'enforcer hits'.

And sure, a protected environment (and there were ways to do it) would have made the system more robust on MMU-based Amigas, though it's still far better to release bug-free code than to just count on memory protection. Windows PCs still get BSODs, after all. And this helped quite a bit, in those days, to keep non-MMU machines reliable as well. I had my Amigas running for months at a time in those days when NOT developing code.

My goodness, I had completely forgotten about Enforcer. I feel old now. Any Unix system had far better multitasking (for that matter, Apple had it on the Lisa).

The point IS, the multitasking was unreliable because there was no memory protection.

Actually, not true. Most versions of UNIX in those days did not have 'better' multitasking, for the simple fact that they didn't have multithreading. So sure, you could fork, but that didn't promote multitasking as basically just another useful programming construct.

And UNIX multitasking was pretty clunky in those days, too. It didn't really get good until Linux came along and fixed a bunch of things for multiprocessing that also improved latency.

Multitasking was not unreliable due to a lack of memory protection.

It pretty much worked. Buggy applications just weren't tolerated, and as mentioned elsewhere, the development team was proactive at delivering developer tools to find these sorts of bugs. Millions of people did amazing things on Amigas, and the multitasking was a huge win. It's pointless to view it through today's software lens, because that's not what it was in competition with back then.

Many posts here have pointed out various technical issues with the Amiga design, as if this caused Commodore's demise or meant that the Amiga never had a chance of succeeding.

However, this isn't the case, for a number of reasons.

Firstly, the main Amiga models, especially the A500 and A1200, did succeed in Europe. With decent marketing and reliable supplies of hardware they outsold Apple and Atari, and held their own against the PC.

Secondly, Commodore didn't have to beat Microsoft and Intel, they only had to beat Apple. If the Amiga had been the last survivor of the 80s systems, then Microsoft would have been propping up Commodore after the antitrust action started.

Lastly, every system ever released has compromises, deficiencies, and areas that become outdated. None of the issues with the Amiga were insurmountable. A healthy, well-managed Commodore could have overcome them:

CPU - Motorola stopped the 680x0 line, but Commodore never exhausted that line anyway. They never shipped an 060 machine at all, and they never put an 030 or higher in a low-cost model.

Yes, these processors were expensive, but they'd have been a whole lot less expensive if a healthy Commodore had been buying a million units a year. That might even have persuaded Motorola to continue the line and develop a Pentium-like RISC core emulating the 68k instruction set.

Custom chips - Someone suggested that specialised hardware naturally migrates to the CPU and eventually everything will end up on the CPU.

This is a very Intel/x86 viewpoint. The PC industry has gone this way because it suits Intel, and they are the dominant player. It's not the only way though. If you look at the console market, it has always been much less CPU-centric.

A healthy Commodore could have designed 3D chips or other custom hardware. They could even have done something like buying 3dfx if necessary. None of this is particularly relevant for Commodore in 1994, though. All they needed to do was ship an 040 machine with the AAA chipset and a built-in CD drive. That gets them through until the next generation. Of course, a healthy Commodore should probably have shipped the A1200 with a built-in CD drive, in 1990!

OS - The Amiga OS lacked memory protection, virtual memory, and multi-user support.

As Dave pointed out, Commodore already had Enforcer. Also, the tiny companies and individual developers who have been developing the OS since Commodore's demise have implemented memory protection and virtual memory; I'm sure Commodore could have managed that. Multi-user support would have been more difficult, but it also doesn't really become an issue until the 2000s, when security starts to become a major concern. A new multi-user filesystem could easily be added - I believe Commodore already had one, but I've never actually seen it.

Things like user-owned processes would have been more difficult, though. However, in 1994, the Amiga OS was in a better state than Windows or MacOS.

Commodore might have had to replace it eventually, but not in the 90s.

IMO, that was the last gasp of the design proliferation era, as we firmly moved into the era of standardization. That success was also relative: over nearly a decade, they still only sold about 5 million Amigas worldwide, most of them computer-in-keyboard machines like the A500. The computer in the keyboard was another '80s throwback. The Amiga mainly sold on ONE feature to most people: gaming. Amigas were mainly viewed as gaming computers, which made them hard to break through into other areas.

They were also a gaming computer in the era of resurging consoles. Amigas: 5 million over the platform's life. PS2: 100 million.

The Hombre project planned to use an HP PA-RISC processor and a 3D graphics accelerator card to deliver a truly next-generation Amiga.

Now that's desperation.

I doubt they'd have got very far using an expensive, boutique server CPU that even HP signed the death warrant on a scant few years later (not that its replacement, Itanium, set the world aflame). The SGI of the 1990s actually understood what animators needed; Commodore didn't have a clue.

The Hombre was designed as a two-chip set. PA-RISC was the instruction set, but Dr. Ed Hepler designed his own implementation, including the 3D instructions.

This was full-time chunky pixels, supporting four playfields in 16-bit color. The two-chip set was intended to work for a game system, but would have been a dedicated GPU subsystem for desktop Amigas. A whole PA-RISC with 3D hardware acceleration would have made a dandy OpenGL engine in those days. Commodore was not trying to compete with SGI any more than we were with Sun or other workstation companies in those days.

Dave, I read that the Hombre was intended to run Windows NT.

Is that correct? And if so, why? I appreciate that NT had a lot of functionality that AmigaOS did not, but none of it seems particularly vital for a games system. I guess porting AmigaOS was not deemed feasible within the timescale, but presumably there were other options than NT?

Upgrading the CPU was nearly impossible for most A500 owners. Even where it was viable, it would - and DID - break compatibility with most early games and applications. Also, the fact is, and it was frustrating as hell to me, that lazy ports of games from the PC, where all of the burden was placed on the CPU, were painfully slow on the 7 MHz 68000-based Amiga without offloading to the custom chips the way native Amiga games did.

Ultima 6 was one of the biggest examples of this by 1990, and probably the straw that broke the camel's back for me. It may have gotten better in the final years, but sadly many of us Amigaphiles (in North America anyway) gave up around 1990-91, driven to distraction by lazy ports.

I don't remember the lazy ports so much as a lack of ports; it was obvious that PC clones were going to dominate and the games market was shifting to them. I remember seeing Star Control II and Wolfenstein 3D on a PC in '92. Great games not coming to the Amiga. It wasn't PC hardware superiority as much as it was market share. When I decided to get a new computer in '93, there wasn't even a slight thought of getting another Amiga.

I put together my first PC, a 486 clone. Almost all my Amiga friends bailed for a PC in 91-93 as well.

Two points to raise. The Amiga 600 was a good example.

It was meant to be the A500CR (CR for 'cost reduced'), but while it was meant to cost $50-60 less than the A500+ to produce, it ended up being that much more expensive! David Pleasance (MD of CBM UK, one of the few competent members of Commodore management) considered the A600 'a complete and utter screw-up': a refresh of a six-year-old machine, at a lower profit margin, at the same price point, without any upgrade whatsoever and, indeed, a downgrade in terms of the keyboard and expandability.

Secondly, the AGA chipset. We never saw it in its full form. Lisa, the RAMDAC, compositor and RAM controller, was butchered down immensely.

Lisa was meant to have built-in Amber (scandoubling and flicker-fixing), while Alice - the blitter, Copper display processor and RAM controller - was meant to have a full 32-bit chip memory bus. Alice didn't. There was also meant to be a hardware chunky-to-planar converter in Alice, with enhanced I/O, including an IDE port which didn't suck.

This became Akiko, and was only ever seen in the CD32. Akiko's enhanced I/O was never enabled on any production model.

Right, the problem with Commodore was, they got lucky being at the right place at the right time, and having a CEO (Jack) who knew how to produce products in volume and cut prices mercilessly. Commodore buying MOS Technology got them the 6502, and enough engineering talent to create a home computer early, with the 1977 PET (6502, custom video controller).

But infighting with Jack drove those engineers away from the company within a couple of years, which is why it was several years before they released a new product, the 1981 VIC 20. The VIC 20 used an ancient video chip that had been for sale from MOS Technology for several years. They used it to make a five-chip, cost-reduced, 3KB Color PET, and just branded it the VIC 20. The fact that it took them THAT LONG to see what they had clearly shows Jack's lack of vision.

The board of executives was so happy with the VIC 20's million sales, they never tried to move technology forward, fearing it would hurt sales. No 16-bit replacement for the 6502, no replacement for the crappy VIC chip.

The engineers got bored, went to Jack directly, and that tiny team created the VIC-II and SID chips for the Commodore 64, 'surprising' management. Then after Jack left, they were handed yet another gift, with the Amiga. They never actually had the engineering talent to improve upon it, as seen by the late, broken AGA release.

THE REAL REASON Commodore was always doomed was the mindset Jack instilled in everyone in the company: BUSINESS IS WAR! This meant that your average anti-social engineer would hide at their desk, or leave, rather than confront Jack and make his presence felt. And his executives learned how to stab each other in the back from the very best. If you're always at war, how do you foster cooperation, retention and long-term engineering goals? Once that war paradigm had put down roots in the company and Jack left, only a culling would have saved the company from ruin. It's just a shame they took such good tech with them.

If you want to read more about Jack's venom, just have a look at this book. The Kindle edition only covers up till Jack leaves; the hardcover gives you every detail of the Amiga years.


They may have sold as that, but that was another marketing failure. There were some great productivity programs on the Amiga - Final Writer ran circles around WordPerfect or even anything on Macs. Even running slow on a base A500 it was better than using a PC or Mac. The rest of the Final suite varied, but they were all competitive. As said earlier, the Amiga OWNED video editing and was very strong in music as well.

It could easily have managed to make the transition into a productivity machine. That one is also entirely on Commodore leadership.

Excepting disk I/O, where Commodore hatcheted the 1541 floppy drive. For anything that didn't tie up the CPU, the 1541 was brutally slow. The 1541 had to be further slowed down for VIC-20 compatibility.

I'm pretty sure it's the other way around. The 1540 was the original, for the VIC-20, and the 1541 is speed-reduced for C64 compatibility.

Both end up using a software-driven polling loop due to different hardware bugs, but the VIC-20 lets the CPU run at a constant speed, always taking the same number of equally-spaced cycles for video refresh. Conversely, the C64 has a 'bad line' every eighth raster line, when the VIC-II loads the next row of tile indices. So the C64's CPU runs with uneven timing, every so often having a bit of a pause - which just so happens to be enough of a pause to miss data from the serial bus. So the 1541 is the 1540 with a ROM update that makes it supply data slightly more slowly, such that the C64's bad lines are no longer enough of a pause for it to miss anything.

Custom chips - Someone suggested that specialised hardware naturally migrates to the CPU and eventually everything will end up on the CPU. This is a very Intel/x86 viewpoint. The PC industry has gone this way because it suits Intel, and they are the dominant player.

No, it just suits physics. Transistors inside the chip get cheaper and lower-power; external pins, board traces and connectors, not so much. Hence, more and more gets integrated into the same chip. The PC is hardly the only, or most radical, example of such integration. And neither is Intel, who lagged many years in integrating the memory controller, after AMD did it for PCs.

You may be right as a long-term trend, but as I said, this didn't matter for Commodore in the 90s.

The industry was heading the other way at that time - sound, 2D and 3D graphics cards were on the rise. A non-CPU-centric system was perfectly viable then, and it still is even now.

Love these articles! Thumbs up. I missed the Amiga boat. I grew up on the C64, and I can credit my computer career to it. How my mother managed to scrape up the $400 or whatever the C64 cost in 1983 I have no idea. Then she got to do it again.

After we were burglarized, I begged nonstop and got a replacement C64 a year later. I wanted the Amiga so badly, but no one I knew could afford it. By the time I was working at 15+ I was trying to save up for one, but life got in the way: girls, cars, rent, etc. By the time I had disposable income (disposable income for me in 1992 meant I wouldn't starve to death, but I would be eating a lot of ramen or mac 'n' cheese, and never going out and having fun), the deed was done.

But I saw the Commodore death-spiral writing on the wall long before that. But now? I mean, I recently dropped $400 on a GTX 1070 video card just so I could play the same games I was already playing BUT SLIGHTLY FASTER. Someone could release a super-powered Amiga now, and if it was under $1k I'd think about it. But it had better do 4K video streaming, LOL.

I mean, the new Asus Raspberry Pi clone can do it for $50. I've played with Amiga Forever and it's OK, but without the hardware it's just not the same. Also, all the games look worse now than the stuff I played in the 90s on PCs. It's hard to go back in time.

Also, it only survived because of the deals Apple made with Microsoft in the mid-90s, which gave Apple some much-needed cash and put the Mac in a unique position compared to other alternative platforms. They basically piggybacked on Microsoft's dominance. Later, they were able to keep the Mac on life support thanks to the iPod and later the iPhone. The Mac platform on its own probably would not have survived.

That's a very good point that the original article misses. The survival of Apple was artificial and only came about because of the antitrust action against MS.

It's a very good story, but it's wrong. The $150M from Microsoft, along with a pledge to develop Microsoft Office for Mac for five years, came as a settlement against Microsoft. Also, cash was not the problem at Apple; they had a bank account of about $2B at the time. What the Microsoft deal was most effective at was changing the public perception that Apple was going down the drain (which it may have eventually, but not nearly as quickly as you might have expected).

I do agree with much you say. However, the 68000 line could easily have destroyed x86 if Motorola had done a better job of it.

The 68040 was much, much quicker per clock cycle than the 486, but it was on the market late and had some thermal issues which prevented it from being overclocked. If it had not had those two issues, and Commodore had not killed off the internal projects, then we probably could have seen a much more powerful A1200. It is also possible that the main Amiga line could have moved to PowerPC if it had survived a few more years.

Could have, but didn't. The thermal problems were a design problem. Motorola also missed the boat with the lack of an FPU.

Custom chips - Someone suggested that specialised hardware naturally migrates to the CPU and eventually everything will end up on the CPU.

This is a very Intel/x86 viewpoint. The PC industry has gone this way because it suits Intel, and they are the dominant player. It's not the only way though. If you look at the console market, it has always been much less CPU-centric.

The current Sony and Microsoft consoles have the GPU and CPU on the same die (an AMD APU), so I'm not sure where you're going with the console analogy (you are mistaken). You're just reinforcing the point that, yes, eventually stuff ends up in the CPU. The Apple TV, iPhone, iPad and all the Android devices are also SoCs that integrate both CPU and graphics on a single die. Discrete GPUs are a rarity outside of the enthusiast PC space; the vast, vast majority of the market has migrated to integrated graphics.

I don't think he is overestimating the performance hit. x86 sucks, big time.

After the release of the Pentium Pro, x86 has been competitive with, and eventually outperformed, everything else. It's the ultimate proof that the instruction set doesn't matter. Further to that, CISC instructions can do more per instruction, which can actually improve cache utilisation and reduce bandwidth to main memory. CPU speeds have increased far faster than memory speeds.

Provided it does what you want it to do. With CISC, high-level functions are hard-coded into the die. Have a bug in your CISC instructions and you are stuck with it forever.

Have it in the compiler and an upgrade will fix it. Bugs in CISC did and do happen. The Pentium had bugs because of its complexity. CISC adds a lot of complexity on the die. Intel's commercial policy doesn't help things either, with the Celeron fiasco: they purposely cripple their processors in order to increase their profits, with a complex offering of different packages for different price ranges.



You clearly haven't heard of microcode updates. The FDIV bug was what, 23 years ago? RISC didn't stay RISC anyway. CISC went RISC internally (with a CISC x86 decoder on the front end only - so internally x64 gets the benefits of being RISC, with the code density of CISC in RAM and the instruction cache), and RISC became more CISC.

AltiVec etc. on PPC, for example, are not RISC.


That didn't work with the Pentium, though. After the Pentium, Intel introduced the Celeron abomination: a purposely crippled Pentium, designed to perform badly in order to increase their profits.



The Amiga was superior to 386/VGA in many ways - no HAM mode on VGA, for instance.

Have a bug in your CISC instructions and you are stuck with it forever. Have it in the compiler and an upgrade will fix it. Bugs in CISC did and do happen.

Replace CISC with RISC and the same is still true.

Try reading some CPU errata sheets sometime. Even in the relatively tiny ARM Cortex-M cores there are some forbidden instruction sequences that will cause the CPU to malfunction. You're also falling into the trap of thinking that RISC somehow means small or simple.

A RISC CPU will have similar complexity to a CISC CPU with the same performance - IBM's POWER, the MIPS R10000, Alpha, etc.


Open AdBlock is a free and open source content blocker for iOS that has been written from the ground up to support both 32- and 64-bit devices.

How does it work?

As you may be aware, Apple blocked the ability to load content blockers on 32-bit devices (those with an A6 chip or older) in a late developer beta of iOS 9. Open AdBlock circumvents this restriction by redirecting Apple's loading checks and making them succeed, using Facebook's excellent fishhook library. Once the verification is bypassed, Open AdBlock simply uses private SafariServices APIs to load and enable the extension. Unfortunately, this makes it impossible for Open AdBlock to be available on the App Store - if you'd like to try it out, you'll have to build it yourself.

Contributors

A shoutout to those who helped work on Open AdBlock, back when it was released:

  • Justin Leger (@jusleg)
  • Martin Turek (@MrSp0ck)
  • Jeremy Gillespie (@jeremyskateboard)
  • Nicolas Da Mutten (@cleverer)
  • Jason Piper (@Xaositek)

TODO: update installation steps

How to install the app (works for iPhone 5 and 5c!)

Since we haven't released OpenAdBlock to the App Store yet, here's how to install the app at the moment. Keep in mind you will have to make a 3.6 GB download and have about 8 GB of free space on your HD.

  1. Download Xcode 7 from here.
  2. Install it (and the Developer Tools).
  3. Download the .zip from this repo
  4. Open the .zip file and open the OpenAdBlock folder, then open OpenAdBlock.xcodeproj. This will open it in Xcode.
  5. Go to Xcode > Preferences. Open the Accounts tab and add your Apple ID (the one you used to sign to be a developer). Close Preferences.
  6. On the left side there will be a sidebar with one file called OpenAdBlock. Select it and click on the arrow to the left of it.
  7. This will cause a bunch of stuff like Identity, Version, etc. to appear in the centre. Select the Team: Unknown Name [Some garbage here] and change it to your account from the drop down menu. Afterwards click on the “Fix Issue” button.
  8. Above it, you will find the Bundle Identifier text field. Change this text to contain your name/nickname instead of “saagarjha” (e.g. if your name is Bob Joe, change it to bobjoe.[whatever was there originally]; therefore making it look like this –> bobjoe.org.OpenAdBlock.OpenAdBlock)
  9. In the centre screen on the top left (right above the identity field), there will be a dropdown menu with the app icon and name. Click it and select the Extension build target (It has an 'E' next to it) and repeat step 8 (!! Remember to only change the beginning of the text !!) (example –> bobjoe.org.OpenAdBlock.OpenAdBlock.OpenAdBlockextension)
  10. Plug in your iPhone, and click the iPhone 6 next to the stop button on the top and select your iOS device (not the model, the name of your device).
  11. Click the play button, hit “fix issue” when it pops up, unlock your iPhone.
  12. On your iPhone go to Settings > General > Profiles. Scroll down and tap the field with your Apple ID email address, then hit 'Trust'.
  13. Now the app is installed on your iPhone, run it and enjoy :)