
Colin Riley: exclusive interview for AT

New exclusive interview with Colin Riley. If you don't know him yet, he earned his BSc at the University of Glasgow and, after working on GPU drivers, debuggers and compilers, has been working at AMD (Radeon) as an engineer for over four years. Thanks to him, your games run fast!

He is also a fan of electronics, FPGAs, and RISC-V. If you want to know more about him and his hobbies, I invite you to continue reading…

Please, THIS IS IMPORTANT! Colin is fundraising for the Edinburgh Children's Hospital Charity. You can also donate to other organizations in the fight against cancer. You can learn more on Colin's personal blog. Your help will save lives!

All answers are personal and do not reflect the views of past or present employers.

Architecnología: I always ask at the beginning: Who is Colin Riley?

Colin Riley: I’m Colin, a software engineer currently focusing in low-level graphics. I live in Scotland, and since an early age always wanted to work in computer technology. Throughout my career I have worked with fun and unusual hardware – like the original PhysX PPU accelerator and IBM CellBE CPU. For over 10 years after leaving University where I studied Software Engineering, I worked for a company in Edinburgh, Scotland, writing compilers, runtimes and debuggers – with some game technology development thrown in, mostly on PlayStation 3. Since 2016 I have worked from home on GPU gaming drivers, and most recently, gaming technology – making games run and look great on PC. Whilst remote working has become a required necessity for most this year, it was already my reality and is something I really value. I have three kids and being around our youngest whilst working from home has made me realise how much I missed with the other two growing up. I don’t think I could work predominantly from an office environment again; I’m definitely one of the people whose productivity increased after a move to remote working. For hobbies I enjoy dabbling in electronics, gaming (particularly competitive first person shooters), and have a blog where I document my projects. My projects have gone from controlling a real Z80 CPU with a Teensy microcontroller, to getting full-size PCIe slots on a Raspberry Pi, to my latest – which is creating a compliant, educational RISC-V CPU for FPGAs.

AT: When and how did you start being passionate about technology?

CR: As a child I was always taking things apart. Walkmans, VCRs, even TVs. I was interested in how these little components used electricity to come alive with sound and video. Despite this, I was totally oblivious to how electronics worked – until I got a 200-in-1 electronics kit. Despite having opened up ZX Spectrums and Commodore 64s from car boot/garage sales, I did not get into programming until our first family PC, a 486 running Windows 95. I started with writing little batch file scripts and then levelled up to Visual Basic after my school computing teacher noticed I was very interested and allowed me to borrow the various install CDs for my home PC. I've been hooked on computer hardware and software ever since, but as I get older I feel pulled towards the hardware.

AT: Do you have a reference? Someone who has inspired you?

CR: Once I realised that games development was what I wanted to pursue, I became involved in the Quake 3 modding scene. The community was so active at the time and there were many folks to take inspiration from. There was always some new game idea being tried, and really talented folks putting it together. I did level design, UI design, and eventually game coding in C. The friends I made back then went on to work in tech, and a few founded companies – like Splash Damage in the UK. I'm still friends with them today. Back when I was starting out and learning, the big programmers to take inspiration from were obviously John Carmack and Tim Sweeney – but for me it really was the modding community channels on QuakeNet IRC and various forums that kept me going and wanting more. There was a feeling of mentorship within that group that was incredibly valuable.

AT: Why did RISC-V catch your attention?

CR: At the time I was working at a company which did work with the LLVM compiler project. I was on all of the development mailing lists, and noticed more and more messages about a RISC-V target implementation. I'd heard about the RISC-V project before, but it was getting lots of praise, so I started looking into it more – that's when I realised it was likely going to gain traction, as the compiler toolchains and ecosystem are a huge factor in wider adoption. As I was looking for an ISA to adopt in my hobby FPGA work – one with a good toolchain – it all came together and I read the specifications.
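(Editor's note: to illustrate that toolchain point, the RISC-V backend has since landed in upstream LLVM, so a stock clang can compile for a RISC-V target out of the box. A hedged sketch, with main.c standing in for any source file:)

    # Compile a C file to a 32-bit RISC-V object file. Compiling only
    # (-c), so no cross sysroot is needed for this step.
    clang --target=riscv32-unknown-elf -march=rv32i -c main.c -o main.o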

AT: Do you see a good growth opportunity for RISC-V now that NVIDIA is acquiring ARM? Do you think it can become a popular ISA (PC/HPC/IoT…)?

CR: Even before recent events I thought it was on a good trajectory. There have been great projects based on the ISA in microcontroller/IoT for a while and just last week I saw SiFive pushing for PC style adoption. I’d love to see an affordable Raspberry Pi style RISC-V board, but I think the affordable aspect is still a while off yet. I hope I’m wrong on that though and we see something appear soon, as it would really open up RISC-V to more developers.

AT: I see that you have designed a CPU using VHDL. What has been the most complicated part of the process?

CR: My «Design a CPU in VHDL» articles have defined my interest in the convergence of hardware and software for the last 5 years or so. For me, as a software engineer, the hardest part was trying to conceptualize the dataflow of the CPU in VHDL. Even as a low-level engineer, developing close to the metal, you can fail to grasp just how complex and timing-critical everything is underneath. To top it off, I had never done any FPGA development previously. I chose VHDL over Verilog because my reading before starting compared Verilog to C and VHDL to Pascal/Ada – and having developed in C for over 10 years, I still would not consider myself an expert. So VHDL it was.

The CPU I designed was called TPU – the Test Processing Unit, or Terrible Processing Unit. I'm sure professional CPU designers would choose the latter name. This was well before Google named their AI processor TPU, by the way. It was an incredibly simple, serial CPU which only ever had one of the traditional Fetch, Execute, Writeback pipeline stages active at a time. To get performance, these normally run in parallel, with multiple streams of data pipelined to maximize throughput. TPU was intended as a purely academic project: these are the steps required to build a CPU; performance will come later.

Even with such a simple CPU, dataflow between all the various units in the design is still critical. It usually defines the timing, which in turn affects the clock speed you can run your design at. Design timing is still something I struggle with, despite TPU now having morphed into the more usable RISC-V RPU project. I feel it is a black art within the FPGA community; there really are not many easy-to-follow guides on how to fix timing issues in designs.
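(Editor's note: TPU's real source is on Colin's blog; as a hedged illustration of that serial structure – my sketch, not TPU's actual code – such a CPU can be modelled in VHDL as a three-state machine in which Fetch, Execute and Writeback take turns rather than overlapping:)

    library IEEE;
    use IEEE.STD_LOGIC_1164.ALL;

    entity serial_cpu is
        port ( clk   : in STD_LOGIC;
               reset : in STD_LOGIC );
    end serial_cpu;

    architecture Behavioral of serial_cpu is
        -- Only one stage is ever active per clock cycle.
        type stage_t is (FETCH, EXECUTE, WRITEBACK);
        signal stage : stage_t := FETCH;
    begin
        process(clk)
        begin
            if rising_edge(clk) then
                if reset = '1' then
                    stage <= FETCH;
                else
                    case stage is
                        when FETCH =>
                            -- read the next instruction from memory
                            stage <= EXECUTE;
                        when EXECUTE =>
                            -- decode it and perform the operation
                            stage <= WRITEBACK;
                        when WRITEBACK =>
                            -- commit the result to the register file
                            stage <= FETCH;
                    end case;
                end if;
            end if;
        end process;
    end Behavioral;

(A pipelined design would instead keep all three stages busy every cycle, each working on a different instruction – which is exactly where the timing pressure Colin describes comes from.)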

AT: What is your favorite FPGA? And why?

CR: Looking back, this one is easy. My favourite was the miniSpartan6+ by Scarab Hardware. It's not available now, but it was one of the first Kickstarter FPGA development boards of the time that had HDMI, memory and an SD card integrated into the board. Being involved in computer graphics professionally, it was always my plan to eventually have some graphical output in my projects. There were boards with VGA, but they could be expensive, and it seemed odd to go down the analog VGA route knowing that HDMI (or, more correctly, DVI-D) could be output by these new FPGA boards. It's the board that made me go from «I want to get into FPGAs one day» to «I have fun using FPGAs as a hobby». I have kept using Xilinx Spartan FPGAs in my projects, as they are very powerful yet have affordable dev boards available. Recently there has been interest in Lattice FPGAs – especially due to open source toolchain support – and I have a few boards, including the TinyFPGA by Luke Valenty. There is now a huge variety of affordable boards available, from the Digilent Arty S7 for Xilinx Spartan fans to Greg Davill's OrangeCrab for the Lattice ECP5 with wide toolchain support. I have tended to stick with Xilinx mainly for one reason – my decision to use VHDL has had unintended consequences when it comes to toolchain support, and the open source initiatives and new HDLs tend not to have great VHDL options. I'm hopeful this changes in the coming months and years.

AT: What’s your favorite development tool, and why?

CR: Debuggers! I feel debuggers sometimes don't get enough credit. I've spent a lot of time implementing debuggers for GPUs and DSPs, and was an early contributor to the LLVM project's LLDB – so I know how hard it can be to implement a really good one with useful features. A great debugger does not get in the way of the developer, and a bad debugger, well, let's not think about those! Most folks see debuggers as tools they use to find and fix bugs in code, but I find myself using them mostly to learn and familiarise myself with new, complex codebases. When presented with a new project – one with a large amount of existing code I have not written – I jump into the debugger and start stepping through the execution path. It's fast-tracked project knowledge. I tend to find hidden details that sometimes contradict code comments, which saves a lot of time down the line. When I studied at university there was not really much time spent on how to effectively use debuggers. I really hope that has changed in software education today. It's not a required tool for development, but it increases my productivity exponentially.
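(Editor's note: a hedged sketch of that workflow using LLDB itself – myapp is a hypothetical binary, and all commands shown are stock LLDB: set a breakpoint at the entry point, launch, step along the execution path, and inspect the backtrace:)

    $ lldb ./myapp
    (lldb) breakpoint set --name main
    (lldb) run
    (lldb) step
    (lldb) bt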

AT: As a game engineer… what has been the biggest challenge you have faced throughout your career?

CR: Gaming is always pushing hardware forward. It's a phenomenon which has had huge and sometimes unseen implications for technology. Not many people can look at a game and truly comprehend the amount of work and technical innovation that goes into it. From coping with delayed and constrained network communications, to asset building on server farms, to the ultra-low-latency sensor feedback in Virtual Reality – the technology of the games industry has filtered into high-frequency market trading, the cloud and medical fields. The hardware of games, traditionally the Graphics Processing Unit, now powers everything from phones to self-driving cars and, more recently, the processing behind research into COVID-19 using Folding@home. It's an industry which ends up feeding into so many other «traditional» industries, and adapting to the new advances in technology, be they hardware or software based, can be difficult. It always kept me feeling like there was something else – and new – that I had to learn. It's a great challenge to have professionally, especially if you can be directly involved in those advances, but it always keeps you on edge and on your toes.

AT: What do you think will be the next disruptive technology in the world of graphics/gaming?

CR: I think scale has got to be it. The scale of virtual worlds, the number of players, the interactivity and the crossover between gaming mediums. For me, gaming has always been a social endeavour, and the likes of Fortnite have shown how much scale can change things – huge, global events occurring in real time in a game environment with friends. Whilst Fortnite certainly struck a winning formula, there are plenty more that will exploit technology to this effect. There is a huge amount of data out there, and a lot of it can be used in games. I certainly hope a creative aspect follows, along the lines of Media Molecule's Dreams on PS4 – a stunning use of technology that lets anyone create virtual worlds and experiences. For this we will need ever more processing power – both local to the player and remotely in cloud servers – and, of course, higher-bandwidth, low-latency connections for those who want to take part. I'm lucky enough to live in an area where internet access is not a problem, but cheaper, wider pipes to everyone – possibly via low-orbit satellite systems like SpaceX's Starlink – will allow for this increase in scale. It's going to be a very interesting and exciting decade to come in this area!

AT: Thank you Colin!

Isaac

Passionate about computing and technology in general. Always trying to unlearn in order to truly grasp.
