My first job was at Imagination Technologies in 2006, working on the first SGX chips that shipped in the iPhone 3GS and the first iPad. I did not last long.
At that time the folks at Apple would send us mockup UI apps with placeholder textures, just to test the performance on the chip. They told us they were building the next generation of the Apple TV -- only much later did we learn we had been working on the first iPad.
Working there was a pretty surreal experience. We were understaffed and underpaid. To a first approximation the teams consisted of a core group of British engineers alongside a revolving door of young European immigrants like myself.
Every Friday morning we would be offered free doughnuts in the canteen, and since the nicest ones were in short supply, some people would arrive early and wait. The coffee came from a vending machine and was terrible, though.
There were some heroic efforts to ship stuff on time, especially on the hardware side of things. I have mad respect for the people who worked there under those circumstances.
To be fair, the company was hemorrhaging money at the time. The main "office" building, where I worked, was actually a run-down warehouse in the middle of nowhere. Compare the following two pictures, from 2009 [1] and from 2017 [2].
>The main "office" building, where I worked, was actually a run down warehouse in the middle of nowhere.
That's nothing, I used to work for one of Europe's biggest semi companies and some of our offices were made of shipping containers lol.
IIRC even the mighty Intel had to completely overhaul their offices in 200? after a Jay Leno visit embarrassed them for looking like the '80s.
Why is it that SW companies pride themselves on nice offices while semi/HW companies have some run-down trailer park offices and see no issue with that?
> Why is it that SW companies pride themselves on nice offices while semi/HW companies have some run-down trailer park offices and see no issue with that?
I haven't noticed that problem in North America: Qualcomm, AMD and NVidia all have nice campuses.
It's pretty standard in the European semi/HW sector. It's not SV, so there's no massive cluster where people can quickly find alternate employment at a competitor across the road if they get mistreated by an employer. In the EU semi space, changing jobs often means changing country, which implies a lot of friction many people can't or don't want to deal with.
American ones too - there may have been an espresso bar, but the free option for us was a Starbucks “iCup” machine whose output tasted and smelled nasty.
Now that I think about it, maybe we should’ve checked the water supply.
I've seen nearly 30 years of VL/IMG coffee, but post-lockdown the kitchen coffee at IMG has come on in leaps and bounds: built-in grinder and freshly brewed per cup.
Has NOT helped my (failed) attempts to wean myself off coffee :-)
It's really nice to see Imagination's newly gained openness (see https://www.phoronix.com/scan.php?page=news_item&px=Open-Sou... for more context). The company's peak is behind it since Apple slowly dropped their IP. Despite being hurt by their policies during the Poulsbo days, I still hope that this isn't a swan song and that it will help their business. Nobody wins with a heavily concentrated GPU market.
>The company's peak is behind it since Apple slowly dropped their IP.
They never did. They are still using IMG's IP even today. Just look at their PVRTC support. It was possibly one of the biggest lies ever and caused the stock to collapse. Now they have been sold to Chinese private equity and are pushing toward competing in the PC GPU market.
I think it depends on what you mean by IP and it's unfortunate that the hardware space overloads those terms. My understanding is that they aren't using any IMG hardware IP blocks, and are simply licensing a couple ancient patents.
There are large similarities, though, from Apple hiring large portions of the dev staff. Apparently IMG capped its tech staff at £75k/yr total comp, and so Apple just opened its own office across the street.
>I think it depends on what you mean by IP, it's unfortunate that the hardware space overloads those terms.
It really doesn't. IP is very specific.
If you are an architecture licensee of ARM, are you using / or paying for ARM's IP?
If you are designing your 3G modem, using CDMA, ( Whether it is WCDMA, CDMA 2000 or TDS-CDMA ) without using Qualcomm Modem or anything Qualcomm designed hardware blocks, are you using Qualcomm's IP?
We are arguing whether Apple needs to pay IMG for tile-based deferred rendering ( you can search for the term in Apple's developer reference documents ), since the relevant IP could be obtained from ARM ( arguably with cross-patent arguments with IMG, but anyway let's ignore that ). How can you be using technology from IMG PowerVR, even with its name on it, and consider yourself not to be using any IP from IMG, when the patent itself has not expired? Quote:
"It will no longer use the Group's intellectual property in its new products in 15 months to two years time."
And I could go on about the private equity and its relationship with Apple's supply chain. But I will keep politics out of this discussion.
> If you are an architecture licensee of ARM, are you using / or paying for ARM's IP?
> If you are designing your 3G modem, using CDMA, ( Whether it is WCDMA, CDMA 2000 or TDS-CDMA ) without using Qualcomm Modem or anything Qualcomm designed hardware blocks, are you using Qualcomm's IP?
The hardware space overloads the term to mean RTL or lower abstract representations of circuit layouts.
> We are arguing whether Apple needs to pay IMG for tile-based deferred rendering ( you can search for the term in Apple's developer reference documents ), since the relevant IP could be obtained from ARM ( arguably with cross-patent arguments with IMG, but anyway let's ignore that ). How can you be using technology from IMG PowerVR, even with its name on it, and consider yourself not to be using any IP from IMG, when the patent itself has not expired? Quote:
> "It will no longer use the Group's intellectual property in its new products in 15 months to two years time."
There's not really an argument about whether anyone has to pay anyone for TBDR; they don't. This comment thread is about the software for a chip that was a TBDR and is older than the maximum patent length. The concept is in the public domain now. Since this chip came out in 1995, TBDRs have been fair game for nearly seven years now.
>This comment thread is on the software for a chip
I was replying specifically about IP. And the context of Apple not using IMG's IP. Which was "never" the case. As pointed out by IMG’s CEO in 2019 before the 2020 IP renewal.
You can make new implementations of TBDR renderers without paying anyone.
> I was replying specifically about IP. And the context of Apple not using IMG's IP. Which was "never" the case. As pointed out by IMG’s CEO in 2019 before the 2020 IP renewal.
The point is that the patents (ie. the IP that would matter in this case) have expired for TBDR, despite your claim "We are argue whether Apple needs to pay for Tile based deferred rendering ( You can search the term mentioned on Apple's Dev Reference document ) to IMG, since relevant IP could be obtained with ARM. ( Arguably with cross patent argument with IMG, but anyway let's ignore that )." No cross patent agreements are necessary.
There was some 'reverse engineering' of the recent Apple products mentioned on the web a few months back, particularly how they do SIMD branching/divergence. It sounded remarkably similar to the PVR SGX/Rogue approach.
Hmm, thanks for the heads-up. It was my understanding that the Apple "GPU" compute part was mostly built in-house, and that only the fixed-function parts were from Imagination at some point, then dropped. I didn't know they were still using it. Would you have more details?
I think you have all of this right. The parts of IMG IP Apple still uses are the TBDR stuff, compression and some other fixed-function things. It is unclear how much of it Apple has modified. The actual compute and scheduling engine is Apple's own.
I had one of these around 1997. It was pretty fast and affordable, but had some major holes in its feature set. I remember it didn’t support OpenGL style custom blending, so a game that rendered explosions using an additive-blended sphere would just display an opaque expanding sphere.
Games that were specifically tuned for its feature set could look great though. These cards had sharper VGA output than the 3dfx Voodoo, which required an analog passthrough from a separate 2D graphics card (the Voodoo was a pure 3D accelerator).
I thought about getting a 3dfx Voodoo but then realized I don’t really play 3D games anyway. The PowerVR clocked in maybe four hours of total use by me, and nobody wanted the card because of its reputation.
Ah that makes me wonder if my father’s PC had a similar GPU.
I played Half-Life at the time, and shortly after the tentacle boss you had to swim underwater. However, this water was just rendered as an opaque color, so at that point I could not finish the game. I had no idea where to go.
Series 1 (e.g. PCX1 and PCX2) had (SRC_Alpha, 1 - SRC_Alpha) blending, but not the full set of OpenGL options.
When Series 2 was released (e.g. ARC1, Dreamcast, Neon 250), it had the full set, including support for destination alpha, which IIRC wasn't available on e.g. 3dfx.
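For illustration, here's a minimal software sketch (my own, not anything from the PowerVR code) of the two per-channel blend equations being discussed, on normalized [0,1] color values. The first is the one mode Series 1 supported; the second is the additive mode the explosion effect above needed:

```c
#include <stdint.h>

/* Alpha blending, the (SRC_Alpha, 1 - SRC_Alpha) mode:
   dst' = src * a + dst * (1 - a) */
static float blend_alpha(float src, float dst, float a)
{
    return src * a + dst * (1.0f - a);
}

/* Additive blending (GL_ONE, GL_ONE), used for glowing effects:
   dst' = src + dst, clamped to 1.0 */
static float blend_add(float src, float dst)
{
    float out = src + dst;
    return out > 1.0f ? 1.0f : out;
}
```

Without the additive mode, a translucent explosion sphere falls back to whatever the hardware does support, which is why it rendered opaque.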
This is exciting! If they open source PowerVR 2, that would help enable a deeper understanding of the Sega Dreamcast's GPU, right?
If I recall correctly, the SH-4 CPU architecture is already open. Just imagine documenting the console completely at a deep level!
I wonder if it might enable some innovative ideas. We see lots of those open source "retro" consoles popping up that rely on old, now open and documented 8-bit CPUs. Maybe this could allow the creation of some "retro" 3D consoles?
I really like the fact that they are open sourcing stuff recently!
I'm helping to make Linux distros run on Chromebooks, and mine (the Acer Chromebook R13) uses a PowerVR GPU (https://github.com/hexdump0815/linux-mainline-mediatek-mt81x...). I struggled a lot trying to use their drivers.
I hope I will be able to use their drivers soon enough!
The code [1] includes an implementation of the FastInvSqrt that dates back at least to 1996. There's an assembly and a C version, referred to as SLOW_SQRT_DIV :).
It uses float/int punning through a union (in C) and one or two rounds of Newton-Raphson refinement.
Interestingly, corrections aren't done with a fixed value (as in the famous 0x5F3759DF version) but through a LUT [2], which, I guess, yields better results but was a bit slower on '90s architectures.
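For reference, here's a sketch of the same bit trick in its better-known fixed-constant form. This is an illustration of the technique, not the PowerVR implementation (which, as noted, corrects via a LUT rather than this magic number):

```c
#include <stdint.h>

/* Fast approximate 1/sqrt(x), fixed-constant variant. */
static float fast_inv_sqrt(float x)
{
    union { float f; uint32_t i; } u;   /* float/int punning through a union */
    u.f = x;
    u.i = 0x5F3759DF - (u.i >> 1);      /* crude bit-level estimate of 1/sqrt(x) */
    u.f = u.f * (1.5f - 0.5f * x * u.f * u.f);  /* one Newton-Raphson round */
    return u.f;
}
```

The subtraction exploits the shape of the IEEE 754 bit pattern (halving the exponent roughly corresponds to a square root), and each Newton-Raphson round then sharpens the estimate; a LUT-based correction would replace the single magic constant with a table lookup.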
> As of right now, due to licensing concerns we have been unable to supply some libraries and headers provided by SciTech Software for "The Universal VESA VBE" However, this was only used for the Tomb raider port, in order to tell the PCX hardware the details of the framebuffer.
Fair enough - who can we contact to try and get UniVBE open sourced? There's a lot of early graphic card info locked in there.
"The first series of PowerVR cards was mostly designed as 3D-only accelerator boards that would use the main 2D video card's memory as framebuffer over PCI."
There are some mentions of (classic) Mac and RAVE support in the code, but I don't remember any mention of PowerVR cards, games, drivers, etc. on the Mac back in the day. Anyone else?
Diamond Multimedia got the hardware working on Macs in 1996. The demos were just outstanding. Trying to make it reliable for real-world games, though, was tedious. Endless triangle mismatches. Nobody was sure how much additional effort would be required to produce a reliable product for sale.
At the time, Diamond's desire for additional Mac revenue was dwindling, along with its share price, so leadership probably made the right call to cancel the project (along with all their future Mac products).
This is what the company pivoted to in the 90s when their VideoLogic video overlay card business went out of fashion. They have been adept at changing course multiple times.
Fair enough. I've certainly used that in other code (eg wrappers around mallocs with checks etc) but this was specifically for that ".c" file so it didn't really matter.
(Also, not sure we could assume the code optimisation was all that great on the compilers back then :-) )