Hello everyone and welcome to our webinar, "6G Reference Designs Extend Real-Time Prototyping Beyond Simulation." My name is Malay and I'm the offering manager for our wireless research products at NI, and I'll be moderating our webinar today. Before we begin, we wanted to cover a few housekeeping items. At the bottom of your screen are several engagement tools you can use. Feel free to move them around and get the most out of your desktop space. Additional materials are available in the resource list; we encourage you to download any resources or links that you may find useful. For the best viewing experience, we recommend using a wired Internet connection and closing any unnecessary browser tabs or programs running in the background to help conserve your bandwidth. For the best audio quality, please make sure your computer speakers or headset are turned on and the volume is up so you can hear the presenters. You can also find answers to some common technical issues in the help engagement tool at the bottom of your screen. The webinar will be recorded, and an on-demand version will be available approximately a day after; we'll e-mail you a link to access the on-demand webinar as soon as it's ready. And lastly, please feel free to submit any questions you have during the webinar through the Q&A engagement tool. We are available to answer questions during the webinar, and we'll also have a live Q&A session once the presentation is finished. The presentation was recorded earlier, so we can answer your questions in real time. So let's get started.

Hello everybody and thanks very much for joining. In today's webinar, we'll be discussing how software defined radios are being used to research and demonstrate wireless technologies. It's hard to talk about wireless research without talking about the Gs. While we haven't quite settled into 5G yet, the vision for 6G is just being defined. A great example of this is that even though 5G millimeter wave is not ready or broadly available yet, 6G is already looking at sub-terahertz. It's exciting to imagine some of the new applications that could be possible in a 6G world: immersive reality, wireless cognition, mobile holograms, to name a few. These applications translate to system requirements and finally to how we implement them. That technology is the exciting part for researchers. We see several common areas of interest: expansion into new frequency bands, augmenting communication with sensing data, and using AI and ML for various network and system optimizations. Talking about new spectrum, 3GPP has already identified more than 21 GHz of spectrum above 100 GHz for possible 6G consideration. This is one of the most difficult topics to build a demonstration around, but we have platforms that can get you started. It's hard to imagine any modern technology that is not a use case for AI and ML, and wireless systems are no different. There are applications across the board, all the way from the intelligent RAN to optimizations in the physical layer. NI has been proud to be working with leading researchers since the very early days of 5G to tackle such challenges. We serve our customers through a flexible, software-based platform that gives them the ability to rapidly prototype new ideas as well as test and refine existing ones.
Back in those days when we started, we were among the first to demonstrate massive MIMO, millimeter wave feasibility, and full duplex communications, all extremely relevant even today. The NI SDR platform covers a wide range of form factors, processing capabilities, and frequencies and bandwidths to accommodate the needs of many different use cases. From the ultra-portable E Series to the powerful FPGA-based MIMO and millimeter wave systems, we've got something for everyone. So how do we build a wireless system using SDRs? For a typical wireless testbed setup, we need a combination of radio front ends and computational and storage options, and NI provides various possibilities for each. The VST provides an instrument-grade, calibrated front end, whereas the USRPs can be used for higher channel count applications. Of course, software plays a critical role in how we choose to build these systems. The NI SDR platform supports LabVIEW, MATLAB, and open source tool flows for system as well as FPGA design. The application, as well as the expertise available, can help determine which option works best for you. Speaking of software, one of the most popular 5G protocol stacks available for the USRP is the OpenAirInterface (OAI) 5G RAN from EURECOM. To share more about OAI, I would like to welcome Florian Kaltenberger, who's a professor at EURECOM and an active member of the OAI team.

Hi, Florian. Thank you for joining me.

Hi, Malay. Thanks for having me.

You know, we were just getting into talking about the OAI software stack, so we're really glad that we could have you here. I wanted to start off by understanding a little bit about what EURECOM's vision is for OAI.

Yeah. So EURECOM's vision for OAI has been, and still is, to provide a reference implementation for 3GPP-based cellular systems, like 4G and 5G today and in the future, of course, 6G. We started off as a research initiative, but in recent years we got a lot of attention from industry. So to maybe refine that vision for the future: we want to make sure that OpenAirInterface finds its way into different products as well. We want to make it not only the reference implementation that people can play with, but also the reference implementation that people can use and build products on. And that's especially important for open and virtualized RAN, where most of the functionality is being run in software. So we see OpenAirInterface as one of the key components of these virtualized, open RAN networks.

Yeah, thank you for sharing, Florian. I think you touched on some really interesting points. O-RAN is definitely one of the key architectural changes that we see coming into our wireless systems, and you also talked about virtualization, which I think is also going to be one of the key capabilities that some of these future applications are going to require, right? So how do you see the OAI roadmap developing? How do we continue finding ways to push the boundaries of what communication systems can do?

Yes. So indeed, that's a big challenge. To have a fully virtualized RAN is a big challenge, because there are some elements in the RAN that take a lot of processing time and cycles, most notably the channel coding and decoding. So to be deployable, you will need some type of hardware acceleration.
And we realized this some time ago with OpenAirInterface already, and we have done some work in that direction. Just recently we demonstrated the offloading of the LDPC decoding to a hardware accelerator, and we have other projects ongoing where we'll be working with other partners to offload the complete layer 1 to different hardware acceleration cards.

I think those are definitely some of the trends that we are seeing, right? And as you start talking about hardware acceleration and the kind of performance that's needed to build demonstrations for these new technologies that people are exploring, the software defined radio architecture becomes almost a fundamental building block of these systems, right? So, given that NI of course has been developing the USRPs, which are some of the most widely used SDRs, how do you see NI and EURECOM working together to support some of this future work that's needed?

So the USRPs have been a big enabler for OpenAirInterface, starting already several years ago. I think OpenAirInterface would not have had the success it has today without the USRPs, and we're very happy to see that the USRP product line is evolving. I think the N310 is a great platform for 5G, and the recently introduced X410 is going even further than that. I'm also very happy to see that you made some partnerships with external RF providers, which now actually allow you to use the USRP together with an external RF front end to provide higher power and bigger coverage, so you can actually build full, even bigger cells, or small cells, with the USRPs. So that's great, and the USRP, I think, will remain an important platform in the future, especially if we look at things like millimeter wave. There are still a lot of things that researchers are exploring and playing with, so that's important. And you certainly know that artificial intelligence and machine learning will play a key role in future systems like 6G, and there, platforms like the USRPs are essential because they allow you to collect measurements and play with these things in a lab environment.

Yeah, thanks for sharing. I think those were some really interesting points about how USRPs have enabled OAI and OAI has enabled USRPs; it has become truly an ecosystem play, right? It's no longer about one company or one piece of software or one piece of hardware that researchers typically need. And you talked about AI and ML, which has its own large ecosystem that has now suddenly become interested in wireless, right? So one thing that always makes me curious, when I think back to the 4G days and now, moving to 5G and maybe even more so in 6G: 4G was still largely just about communication, right? How do you build a really good, really efficient communication system? 5G made a foray into looking at use cases that were beyond communication, and some of that development started there, and now in 6G, I would say a lot of it is about applications.
It's no longer just about data rates, which are already fast; now there are a lot of other things that people want to get out of a communication system. So given this transformation that we have seen, how important do you feel prototyping, test beds, and building demonstrations are becoming, as opposed to the earlier days of communication, where theoretically proving concepts was what really mattered?

Yeah, you're absolutely right, Malay, that test beds are becoming more and more important, especially because the systems are getting much more complex. I mean, 5G is already an order of magnitude more complex than 4G, and 6G will be even more so. You can't describe these systems on paper anymore; that's not sufficient. And especially, as you said, with the introduction of AI and ML, in any case you need data, you need data to train those models, and the only way to get that data is on test beds. It would be very difficult to train your system on a live network. So these test beds are very important, and I'm very happy to see across the whole community that a lot of effort is being put into this. We see test beds popping up everywhere, big government money is also being spent on building such test beds, and a good part of them are built with USRP equipment. So this is great, and we'll need all of this for 6G. We need to have these test beds ready to get 6G up and running and to train those AI/ML models that we need for 6G.

Excellent. Thank you for that insight, Florian. And I completely see the fact that there are going to be a lot of different ecosystems that will need to come into play as we look at applications beyond communication, and having an open ecosystem, both a software platform and a hardware platform, will be critical to support research in these many different directions. It was great to have you, Florian. Thank you for all the information and thank you for joining us today.

Thanks for having me, Malay.

As Florian mentioned, an end-to-end 5G NR network can be a great starting point for many researchers who are trying to build on 5G. This consists of putting together the core network, the gNodeB, and the radio on one side, and the COTS UE or a software UE on the other. However, we have learned that setting all this up can be quite a painful and time-consuming task. So, to support researchers in getting the system up and running faster and less painfully, we created a reference architecture that lays out the details of the hardware and the software needed, as well as how you configure the system. We have Neil joining us today to show what this reference architecture system looks like.

Hello, and welcome to the OAI reference architecture for 5G and 6G research with the USRP. This is a demonstration video of our reference architecture system. This reference architecture provides a system that's based on open source software and open standards. Several USRP devices are supported, and so are several different UE implementations. In this specific demonstration video, we'll show you a reference architecture based on the following hardware and software. For the gNodeB, we'll use the USRP X410, and for the UE, we'll use a Quectel RM500Q wireless modem module.
We get SIM cards for that modem module from open-cells.com. The host computers controlling both the gNodeB and the UE, as well as the core network, are based on Intel Core i7 and i9 CPUs. The software we're running is Ubuntu 20.04.5. We're using Linux kernel 5.15, and on the gNodeB system we're using the low-latency kernel. We're using UHD, the USRP Hardware Driver, version 4.2.0.0, and for OAI we're using version 2022.w33. We're also using a specific version of the Quectel firmware.

This is a system diagram showing you the overview of our system. On the left is our core network system, again running Ubuntu 20.04, as we said a moment ago, and it's based on an Intel Core i7 CPU with four cores running at about 3.4 GHz. It's using kernel 5.15 and OAI Core Network version 1.3. That system is connected to a switch. The switch is also connected to the gNodeB, which is also running Ubuntu with the low-latency kernel. The gNodeB is connected to the USRP, so it's running UHD, the USRP Hardware Driver, version 4.2.0.0. This machine is more powerful than the core network system: it's an i9 CPU, a 10940X with 14 cores running at 3.3 GHz. It's connected over dual 10 Gigabit SFP+ Ethernet to the USRP X410 radio. The radio is then connected over RF coaxial cable to the Quectel UE module, which is connected through a USB M.2 adapter board to the COTS UE computer, another Core i7 system with four cores running at 3.6 GHz. The specific Quectel firmware version number can be seen there below.

This is a diagram of the RF cabling between the radio and the COTS UE, the Quectel UE. We take the four outputs from the Quectel UE and put them into a four-way splitter; there are four antennas on the Quectel module. We take the common lead from the splitter and put that through 40 dB of attenuation. That is then connected to a two-way splitter to connect the RX and the TX ports on the USRP.

Next, I'll show you a couple of photos of the system. You can see here the overall photo of the entire system, showing the Quectel UE, the Quectel UE host computer, the two Intel servers, the core network server and the gNodeB server, the X410 radio, and the switch that's in the middle. You can see the two-way and the four-way splitter; we'll see a better picture of that in a moment. And then of course the two monitors, keyboards, and mice to control and drive the systems.

These are the system specifications that we have for this demonstration today. We're operating in 5G SA, or standalone, mode. We're using TDD duplexing. We're using a 40 MHz channel bandwidth with 30 kHz subcarrier spacing. The downlink is using 64-QAM. We're operating in SISO mode, so just one by one. And we're operating in FR1, in band n78, which is 3.3 to 3.8 GHz. The downlink data rate is up to 80 megabits per second, as measured by iperf, and the sampling rate being used on the USRP to support that channel bandwidth is 61.44 megasamples per second.

I'll mention that there are various resources you can use to get further technical information and support. I'll come back and mention this at the end of the video as well, but there are basically three ways you can get help.
One is the application note that's published on the Ettus Research Knowledge Base, which is a comprehensive write-up and documentation of this reference architecture. It includes all kinds of things, including a bill of materials and configuration and assembly instructions, and walks you through the process of building the entire reference architecture and replicating everything that you see here in the video today, as well as other configurations. There are also three relevant mailing lists that can be very useful for getting technical questions answered. And finally, there's e-mailing Ettus Research / National Instruments directly. And without further ado, we will begin the demo. I'll switch my screen now to my desktop and start showing you each of the three machines, and we'll start running the 5G network.

OK, here I will start and invoke the core network system. I'm on the core network system now, and I have copied a couple of commands I'll need throughout this demo into this README file in a text editor. I'll start by just confirming our system configuration. I'll show you that we have the right kernel version running here, kernel 5.15, and we'll confirm that we're running Ubuntu 20.04. We see that that's confirmed, so that's correct. So we'll go ahead and start the core network now. I'll go to that folder, and then I will invoke this Python script, which will start the core network. OK, the core network should now be running. Let's verify that. And we see that the various Docker containers have been created and are now running, the AMF, the SMF, etcetera. So this looks correct; this looks good. I'll check with ifconfig that all the interfaces were created correctly. We see a whole bunch of virtual interfaces there, some physical interfaces here, and then the demo-oai interface here up at the top with the .70.129 IP address. So that looks correct. We can verify that with Wireshark: we'll take a look at the demo-oai interface and make sure that we see traffic between the various Docker containers, and this looks correct. We see various messages being passed, we see the heartbeat request and response, so there is traffic flowing between the various containers. Later, once we invoke the gNodeB, we'll see the NGAP messages going between the gNodeB and the AMF on the core network system. We'll see that shortly once we start the gNodeB; for now, I'll stop the capture and exit Wireshark. One last thing I'll point out is that later, when we want to look at the log files, we can do that using this command. We can look at the AMF log files; right now there's nothing there, since we don't have any gNodeB or UE running at the moment. And we can look at the SMF log files. We'll come back and look at those more closely later in the demo. Next, we'll go ahead and invoke the gNodeB system in the next section.
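For reference, the core network bring-up we just walked through maps to commands roughly along these lines. This is a sketch assuming the standard OAI 5G core docker-compose deployment described in the application note; the directory layout, the script options, and the container names are assumptions and may differ on your system, so check the app note for the exact invocation:

    uname -r                                   # confirm the 5.15 kernel is running
    lsb_release -a                             # confirm Ubuntu 20.04
    cd ~/oai-cn5g-fed/docker-compose           # assumed location of the OAI core network checkout
    sudo python3 core-network.py --type start-basic   # start the core network containers
    docker ps -a                               # verify the AMF, SMF, UPF, etc. containers are up
    ifconfig                                   # look for the demo-oai bridge with the .70.129 address
    sudo wireshark -i demo-oai                 # inspect traffic between the containers
    docker logs oai-amf -f                     # follow the AMF log (container name may differ)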
I'm now on the gNodeB system, and we will start the gNodeB. The first thing I'll do is check that the USRP X410 is properly working and connected to the host computer here. So I'll run the uhd_find_devices command, which searches for all USRPs that are connected to the computer directly over Ethernet or over PCI Express. In this case, one radio is found, and we see the two IP addresses corresponding to the dual 10 Gigabit Ethernet links and the one Gigabit Ethernet management port. So that looks correct. Now let's run the probe command. This looks a little more closely at the device and verifies that the radio can communicate with the host computer. The timeout event you see there is benign and won't influence anything. OK, great, the command ran and there were no errors reported; the device can communicate with the host computer. The output shows things like the UHD version number up at the top, then version numbers for the hardware and the FPGA firmware and things like that, the clocking and timing sources, the RFNoC blocks that are on the FPGA, the various static connections on the device, and the tuning ranges and bandwidth ranges of the two daughterboards inside. All of this looks correct; this is exactly the configuration we would want, and there are no errors, so I think we're ready to start the gNodeB. I'm already in that folder, so I will invoke the gNodeB directly. Before I do, I will first start Wireshark, and I will show you the NGAP messages that are sent between the gNodeB and the core network. So let's start Wireshark and capture on the eno1 interface, and I'll let that run. I'm going to put a filter here. Right now there are no NGAP packets yet, but there will be in a moment. Let's start the gNodeB. It's now invoking UHD and talking to the radio. OK, now it's up and running, and we see the two NG Setup Request and NG Setup Response packets sent between our gNodeB system, at .50.72, and the core network Docker container. So the gNodeB is now up and running, and we are ready to start the UE system, and we'll do that now in the next part.

OK, now we are on the UE machine. This system is connected to the Quectel module, and I have already invoked the Quectel connection manager; that's in this window here on the upper left. I'll bring up another terminal just to show you how you would invoke it. I put the Quectel connection manager in the Downloads folder, and in that quectel-CM folder you can see there are some source code files and then the quectel-CM binary. So I would run this as sudo ./quectel-CM, and that's all you need to invoke the connection manager. I did that a few moments ago, and you can see some output here; I'll show you up at the top how that looks. I invoked it and it started, and it confirms at this point that it created the wwan0 interface, and then it waits for a connection. There were some issues with the MAC address on this particular SIM, so it took a few moments to catch and get the right MAC address, and once we had the right MAC address it made the connection. So this is all set up now. This is running, and you can see that it has created the wwan0 interface, and I will bring up another terminal and show that to you. If I type ifconfig, we see here that the wwan0 interface is up and has been given an IP address of 12.1.1.2, so that looks correct. We have a DHCP lease here; it's telling you the lease time is 7200 seconds, and there's some other output here. This is all correct. We have now been given an IP address from the core network, and the Quectel UE is now ready to communicate. It's on the network and ready to go.
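As a rough guide, the gNodeB and UE bring-up described above corresponds to commands along these lines. This is a sketch assuming the usual OpenAirInterface build tree and the gNodeB configuration file from the application note; the paths, the exact config file name, and any extra softmodem flags are assumptions to be checked against the app note:

    uhd_find_devices                           # list USRPs reachable over Ethernet or PCIe
    uhd_usrp_probe                             # query the X410: FPGA image, daughterboards, tuning ranges
    cd ~/openairinterface5g/cmake_targets/ran_build/build    # assumed OAI build directory
    sudo ./nr-softmodem --sa -O <gnb band n78, 106 PRB config file>   # start the gNodeB in standalone mode

    # on the UE host:
    cd ~/Downloads/quectel-CM                  # folder holding the connection manager
    sudo ./quectel-CM                          # creates the wwan0 interface and requests a session
    ifconfig wwan0                             # should show the 12.1.1.x address assigned by the core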
I also have, in this other window, Minicom, which is a terminal program that allows me to send AT commands to the Quectel module directly. I've already invoked it, so I won't do it again, but I'll show you how you would invoke it: you run it as root, so sudo minicom -D and then /dev/ttyUSB3. That's the device name of the USB serial interface to the Quectel. When you run that, you'll get a screen that looks like this; I'll go ahead and remove that terminal window. You get a screen that looks like this, and it tells you the port, the time it was compiled, and so on. Then there are a bunch of AT commands you can run. Some of them are diagnostic commands that tell you information about the module and the SIM card and things like that. Others are the CFUN=1 and CFUN=0 commands that basically turn the module on and off. I have been toggling the module here, and finally, one of the most recent times I turned it on, it was able to make a connection, and that's when the output in this window appeared and the wwan0 interface was given a DHCP lease for an IP address from the core network. Now the Quectel module is on the network, running and ready to go. So the UE system is up and running, and let's go to the next part, where we'll start communicating with the Quectel UE from the core network.

OK, we're now on the core network machine, and we're ready to stream a video from the core network to the UE. Let's do that, and let's first make sure that we're still connected to the UE from the core network, so we'll ping it. OK, the ping looks like it's working, so we seem to have connectivity to the UE from the core network. So let's go ahead and stream a video, and to do that I'll first add a route. OK, I added a route to our routing table here so that we can stream from the core network to the UE, and I will then run VLC to start the streaming. I'll come here and I'll select a file, this NI anthem video file here, and I'll stream that. This looks good, and the destination will be UDP. Sorry, I forgot to put the address here; let's go back and do that. That should be the address that the UE has, which we saw in the last part of the video when we looked at the Quectel connection manager. So that should be correct. There we go, and let's go ahead and stream. OK, so this video is now playing here locally, but it should also be streaming and should be visible on the Quectel UE. So let's go and take a look on the UE computer and see if we can receive that video stream.

Now we're back on the UE system, and the video should be streaming from the core network to this system here and should be visible. So let's go and open VLC, and let's play a network stream. Here we'll open the network stream; this is already set to the correct port, so we can simply click play. And the video is playing ("What's up, big thinkers... science's greatest caretakers, the ones who stare down failure and say not today..."). So this video is now playing successfully, streamed from the core network to the Quectel UE computer over our 5G network, through the USRP, over the cable, and through the Quectel modem module to the host computer, where it is being displayed. (The NI anthem audio continues in the background: "...doing the things they say can't be done. Let's engineer hope for those who need it the most.")
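For reference, the UE-side AT-command console and the downlink streaming test described above look roughly like the following. The serial device, the route target, and the UDP port are placeholders, and the VLC invocation is one possible way to stream a file over UDP, not necessarily the exact dialog-based setup used in the demo:

    sudo minicom -D /dev/ttyUSB3               # AT-command console to the Quectel module
    AT+CFUN=0                                  # radio off
    AT+CFUN=1                                  # radio on; module re-attaches and wwan0 gets a DHCP lease

    # from the core network host:
    ping 12.1.1.2                              # UE address assigned by the core network
    sudo ip route add 12.1.1.0/24 via <UPF container address>    # placeholder; route downlink traffic toward the UE
    cvlc video.mp4 --sout '#std{access=udp,mux=ts,dst=12.1.1.2:1234}'   # stream the file over UDP

    # on the UE host, open a network stream in VLC: udp://@:1234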
And that concludes our demonstration of the system, where we just showed a video playing in real time across the network. Please stay tuned for additional demonstration videos that will show more features of the reference architecture. Thank you very much for watching.

Thank you, Neil, for walking us through that. Some new technologies may be difficult to explore in an end-to-end network system at the start. Being able to record and analyze IQ streams is one of the most powerful ways we can look at a new technology and abstract it from the complexities of the upper layers of the protocol stack. We can use it to understand how new waveforms behave under different conditions and what the impact of the real environment is. A good example is the understanding of reconfigurable intelligent surfaces (RIS). It can be difficult to model all the effects of multiple air interfaces and surfaces without collecting real-world data with such SDR systems, and this is what's needed to understand the feasibility of RIS. To enable this kind of research, we have put together an SDR system that can seamlessly record, aggregate, and transmit IQ streams at high bandwidth and over multiple frequencies. It has USRPs, front-end and storage options, along with reference software that can get everything going. This can provide a great starting point for research on integrated sensing and communication, RIS, and millimeter wave beamforming and beam steering applications. We are also working on expanding the frequency and bandwidth capabilities with our newly released VST 3. Using the right front ends, its frequency range can be extended from millimeter wave to sub-terahertz, making it ideal to explore new spectrum for 6G. We're also working on adding additional signal processing capability to manage the high-bandwidth applications. I just want to wrap this up by talking about AI and ML. I'm glad to see it finally getting incorporated into the wireless domain. There are so many possible applications for AI and ML working in the different layers, all the way from high-level network orchestration and management to optimizing channel estimation techniques in the physical layer. Whatever we do, one thing is clear: the data to train the models will be critical for this research, and that's where NI is starting its exploration journey. With that, let's get into the Q&A session. Thank you everybody for joining. I hope you were able to get a good idea of how the system works with the demo. We have some great questions in the Q&A, and I'd like to invite Karen to share the questions with us.

OK. Thanks, Malay. Hi, everyone. So let's go one by one. The first question we got, from Anwar: will a video recording of this session be available later? I think the answer is yes, it will be available. And also, for configuring the OAI setup, you can contact us and we can help you with that kind of thing. So the answer to that question is yes.

The second question that I see in the Q&A: what is the function of the X410 in this demo? In the current demo, the X410 operates as a radio head. It is just doing the up/down conversion of the IQ samples to the RF frequency band, which in our case is band n78, and simple digital signal processing of up/down conversion and resampling. No OAI functionality is implemented in the USRP right now. OK.
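Since recording IQ streams comes up both in the SDR recording system mentioned above and in using the USRP as a radio head, here is a minimal sketch using the rx_samples_to_file example utility that ships with UHD; the device address, center frequency, and file name are placeholders to be adapted to your setup:

    # capture about 10 seconds of IQ at 61.44 Msps from a USRP such as the X410
    rx_samples_to_file --args "addr=192.168.10.2" --freq 3.6e9 --rate 61.44e6 --duration 10 --file iq_band_n78.dat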
The next question, from the same attendee: how can you connect the 100 Gigabit port of the X410 to 10 Gigabit Ethernet? What cable do you use? Actually, there are two options for connecting the 100 Gigabit port of the X410 to the PC. In one case we use the Intel E810 100 Gigabit card, which also requires a very powerful computer. In the case of a 10 Gigabit connection, we use breakout cables, which have a QSFP connector on one side, connected to the 100 Gigabit Ethernet port, and four 10 Gigabit connectors on the other side. With that cable you can connect to multiple ports of your network adapter or network interface card, and that lets you connect your USRP to your PC.

The next question, from Anxo Tato: where can we find information about whether the USRP is able to transmit with the higher power needed for commercial applications? A very good question. The transmit output power is available in the USRP documentation, and you may see that in some cases it can be lower than the 3GPP requirement. Our company is working on extensions for the USRP to be able to meet the 3GPP requirements; just look at our web page and you will get more information. For additional information you can also contact Amar Haji Omar, who is the business development manager of the company, or me or Malay; we can answer you directly too.

The next question: can all 5G mobile phones be used as a UE for testing the platform, and if not, which are supported? Very good question. There is a list available on the OAI web page, but we have tested with two types of mobile devices. We tested the Quectel RM500Q-GL, which Neil was using during this video demo, and the second mobile device tested was a Google Pixel 5a. Other phones we didn't test, so you should follow the OAI web page to get newer updates from their side. OK.

We do not have another question in the Q&A right now, but I would like to highlight a few questions which we are receiving from customers who are planning to reuse the OpenAirInterface reference architecture. One of those questions is: does the reference architecture work only with the X410 USRP? The answer is no. It also supports the USRP X300 series and the N300 series. In the demo, Neil was using an X410 USRP, but if you pay attention, the configuration file name in the demo was for the N310, so from there you can easily see that it supports different types of USRPs, even B Series USRPs, although with a B Series USRP you will have some limitations.

There is another question that we hear very frequently: which UHD versions are supported? OAI supports UHD 4.2 and 4.3; those are the supported versions.

Another interesting question I heard: how are you going to FR2? Actually, the USRP does not support FR2 directly; the USRP supports only FR1. For going to FR2 you need to use an up/down converter, and we are working with the company TMYTEK, who build up/down converters that can be used with the X410 USRP. We have some demos where we do beamforming with the TMYTEK beamforming antennas and show the constellation using 100 MHz of RF signal bandwidth.

Also, there is another interesting question which I heard during this time: which 3GPP bands are supported by the reference architecture?
If we look at it from the USRP side, the USRP supports all the bands. From the reference architecture side, it supports the 3GPP FR1 TDD duplexing bands. For FR2 you can see some configurations available in the shared project, but that is still in development by OAI and will be available soon.

What channel bandwidths are supported? In the demo, Neil was using 106 PRBs, which means he was using a 40 MHz bandwidth. We also tested with 60 and 100 MHz channel bandwidths in FR1. OK, let me switch back to the Q&A session; I see a few more questions have come in.

Again from Anwar, a technical one: is it possible to interface the USRP subsystem with an off-the-shelf NR device? The answer to that question is yes, and during the demo we were doing exactly that: Neil was using the Quectel RM500Q off-the-shelf mobile device, which was connected, if I'm not mistaken, by USB to the laptop and running as the mobile station there. The second part of the question: are the USRPs compliant with the band emission requirements for over-the-air transmission? A very good question. As far as I know there are no measurements on that, but I'm sure that during the development of the X410 we tried to keep everything compliant with the 3GPP emission parameters. For more details we can have direct contact, do some measurements, and answer your question in more detail.

The next question: is it really required to use the low-latency kernel on a gNodeB with a Core i7, 12 cores, 3.2 GHz? Actually, it is very good practice to use the low-latency kernel to avoid lates. I did some trials with a Core i9, an 18-core processor running at up to 4.8 GHz if I remember correctly, without the low-latency kernel, and in some cases I was getting "late" errors in the data communication between the USRP and the gNodeB PC. So my recommendation is yes, use the low-latency kernel to get more stable operation.

Then: with the X410, is it now possible to run an application that transmits and receives while saving the IQ samples to the same storage device? And can this be done with the B210 and X310? If I understand the question correctly, we are not talking about OpenAirInterface here; correct me if my understanding is wrong. If I understand correctly and we are not talking about OpenAirInterface, we have another application which gives you the ability to record samples with the X410 device, streaming the samples directly to disk and recording them, and also to do generation. The demo that I mentioned, where we run the FR2 EVM measurement with the NR RF signal, is doing exactly that.

I see no more questions. How much time do we have, Malay or Josephine?

Thank you, Karen, that was great. I think we are exactly at time, and we appreciate everybody who could join us today. As we mentioned at the beginning, the recording of this webinar and the slides will be available at a later date on the NI website, and everybody can access them then. Thank you everybody, and have a good day.

Thanks, everyone. Have a good day.