NVIDIA's press release states that "Jetson TX1 is the first embedded computer designed to process deep neural networks -- computer software that can learn to recognize objects or interpret information." The 3.4 x 2 inch module includes a Tegra X1 ARM Cortex-A57 processor with 256-core NVIDIA Maxwell graphics, 4GB of LPDDR4 memory, 16GB of eMMC storage, 802.11ac WiFi and Bluetooth, and Gigabit Ethernet support.
AnandTech Article: http://www.anandtech.com/show/9779/nvidia-announces-jetson-tx1-tegra-x1-module-development-kit
The Jetson TX1 Development Kit will be available for preorder starting Nov. 12 for $599 in the United States. The kit includes the Jetson TX1 module, a carrier board (pictured below), and a 5MP camera. The stand-alone module will be available in early 2016 (for $299 in bulk).
The Jetson TK1 (not TX1) was released in 2014 to encourage the development of products based on the Tegra K1 processor. However, according to AnandTech, developers were using the Jetson TK1 outright as a production board, choosing to focus on peripheral and software development instead of system hardware development. With the new TX1, all of the I/O connectivity is provided on a carrier board, enabling rapid development on the credit-card sized TX1 module. After development is finished, the TX1 module can be directly deployed in products, such as drones.
NVIDIA used a drone application to promote the Jetson TX1
Comments
Sweet, thanks for the update. One less thing to do myself... ;-)
@JB, RTK will be supported natively by the Pixhawk, although I don't think the Pixhawk really does much except pass the correction data it receives from the GCS on to the GPS, so all the hard work is done in the ground station (I think). MichaelO of Mission Planner fame has been adding support to both MP and ArduPilot, and support for two RTK GPSs should arrive with Copter-3.4. I don't know whether that processing has to be done on the ground in the GCS or whether it could be moved to a companion computer; I suspect it needs to be done on the ground with the help of another stationary GPS.
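If the correction handling ever did move onto a companion computer, the forwarding side at least looks fairly simple: the companion just chunks the RTCM stream from the base station into MAVLink injection messages and lets the autopilot pass them through to the GPS. Here's a rough sketch of that idea using pymavlink; the serial port, baud rate and the source of the correction bytes are placeholders, and I haven't tested it against Copter-3.4, so treat it as illustrative only.

from pymavlink import mavutil

# Connect to the autopilot from the companion computer.
# The port and baud rate are placeholders for whatever link you actually use.
master = mavutil.mavlink_connection('/dev/ttyTHS1', baud=921600)
master.wait_heartbeat()

def forward_rtcm(rtcm_bytes):
    # GPS_INJECT_DATA carries at most 110 bytes per message, so chunk the stream.
    for i in range(0, len(rtcm_bytes), 110):
        chunk = rtcm_bytes[i:i + 110]
        master.mav.gps_inject_data_send(
            master.target_system,
            master.target_component,
            len(chunk),
            chunk.ljust(110, b'\0'))  # pad to the fixed 110-byte field

# forward_rtcm() would then be called with each batch of correction bytes
# read from the stationary base station (or an NTRIP caster).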
Agreed Randy.
GPS is super sensitive in comparison to other RF, but at 1200-1500 MHz I'm hoping it's not affected as much by USB3 itself running at 2-5 GHz. Some testing is definitely required, even for the processor clock etc., and we should put together a list of tests to provide an indication of how each system is impacted by the other.
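As a starting point for that test list, one simple check is to log the per-satellite C/N0 the GPS reports and compare runs with the USB3 device (or camera, processor load, etc.) on and off. Here's a minimal sketch assuming a GPS that outputs standard NMEA GSV sentences over a serial port; the port name and baud rate are placeholders.

import serial  # pyserial

PORT, BAUD = '/dev/ttyUSB0', 9600  # placeholders: adjust for your GPS

def snr_values(nmea_line):
    # Extract per-satellite SNR (dB-Hz) values from a $..GSV sentence.
    if 'GSV' not in nmea_line[:6]:
        return []
    fields = nmea_line.split('*')[0].split(',')
    snrs = []
    # After the header, satellites come in groups of four fields:
    # PRN, elevation, azimuth, SNR -- so SNR sits at index 7, 11, 15, ...
    for i in range(7, len(fields), 4):
        if fields[i]:
            snrs.append(int(fields[i]))
    return snrs

with serial.Serial(PORT, BAUD, timeout=1) as gps:
    while True:
        line = gps.readline().decode('ascii', errors='ignore').strip()
        values = snr_values(line)
        if values:
            # Compare this average between runs with USB3 idle and under load.
            print('sats: %d  avg C/N0: %.1f' % (len(values), sum(values) / len(values)))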
BTW I posted a question on the Agbot thread regarding RTK support on the Pixhawk. Will the PXH support it natively without a companion computer, and roughly what will the hardware requirements be for it to work? Thanks.
--
Lol it seems I don't know which right to write right. ;-)
For RF interference, the interference on the GPS is the one I worry about the most. So many new devices impact its ability to get a lock. I've seen the LidarLite, GoPro, RPi camera and some others all impact the GPS.
Am I right in assuming that USB3 causes interference along the whole USB3 interface and not just at the connectors? If so, then the connectors might not be contributing that much in comparison to PCB layouts. I'm also wondering now whether CSI etc. have the same issues or not.
RF layout in UAVs is always difficult, especially when trying to fit everything on small airframes. It will be hard, I imagine, to create a universal solution for every configuration. Even more reason for SDR, I suppose: you can monitor RF whilst you move things around.
But having said that, not everything critical is in the 2-5 GHz band. Telemetry and 4G are mostly outside of it, so there are at least some options if it gets bad. Definitely worth looking out for in the design phase, though.
@ Jurgen
This is an interesting paper from Intel concerning our previous discussion about usage of PicoBlade connectors.
USB 3.0 Radio Frequency Interference on 2.4 GHz: http://www.usb.org/developers/whitepapers/327216.pdf
SUMMARY:
The noise generated due to the USB 3.0 data spectrum can have an impact on radio receivers whose antenna is placed close to a USB 3.0 device and/or USB 3.0 connector. The noise is a broadband noise that cannot be filtered out, since it falls within the band of operation of the wireless device (2.4-2.5 GHz). The noise degrades the signal-to-noise ratio that the wireless receiver sees and limits its sensitivity. This then reduces the operating wireless range of the device.
Improving the shielding on the USB 3.0 receptacle connector can help reduce the amount of noise radiated due to USB 3.0 signaling. In addition, shielding of the USB 3.0 peripheral device plays an important role in reducing the amount of noise radiated in the 2.4-2.5 GHz range. This is particularly critical for peripheral devices that are placed close to the PC platform, such as a flash drive. Placement of the wireless antenna should also be carefully considered on a platform and be located as far away as possible from a USB 3.0 connector and/or device.
Here's an example of how to reduce this noise:
Point Grey issues the same warning:
How can I minimize interference between my USB 3.0 camera and wireless devices?
This article offers advice on minimizing the impact of any interference between USB 3.0 cameras and wireless devices operating at up to 5 GHz.
Certain USB 3.0 devices and cables have been known to cause some interference with wireless devices, such as wireless GPS units. This interference is due to noise from the USB 3.0 data spectrum falling within the wireless device's frequency band.
To mitigate this interference, users can:
1) Put as much distance as possible between the camera and the wireless antenna.
2) Shield the USB 3.0 camera and/or USB 3.0 connector.
3) Invest in a higher gain antenna for the wireless device.
That is an important reminder to take great care with board design, shielding, and connector and cable selection. With USB3 we are now getting into the 2-5 GHz frequency band, and that impacts radio links, GPS, and other essential functions.
That's a really cool use-case, I like it. It's exciting because there's a lot within grasp, given the capacity for a "high level of autonomy", as Patrick mentioned earlier. Can't wait to take off!
Looks great Dustin.
Sounds like a worthwhile endeavour and I can't wait to see what comes out of it.
From my renewable energy industry perspective there's a lot of potential for imaging applications in power management, especially in operation and maintenance optimization, and particularly in distributed power generation.
I always say you can only make an effective decision if you understand the problem. Most power generation control is fairly antiquated and operates "in the dark" without the necessary inputs, so there is a lot of room for improvement.
One item we're working on is cloud (shadow) monitoring with CV, through which it's possible to predict both load and solar generation on a microgrid, to reduce the size of storage batteries (by some 90%) and more easily manage spinning reserve and DSM. In this case we are looking up at the sky rather than just looking down from a UAV! It always surprises me how many of the decisions we make as humans are based on visual cues, yet until now, and in comparison, we haven't bothered enabling imaging technology as much as we should. ;-)
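The image-processing side of that can start out very simple. Here's a minimal sketch of the sort of thing I mean, using OpenCV and the common red/blue ratio heuristic for upward-facing sky cameras; the threshold and file name are placeholders and would need calibrating for a real installation.

import cv2
import numpy as np

def cloud_fraction(sky_image_path, threshold=0.75):
    # Rough cloud-cover estimate from a sky camera: clear sky is strongly blue,
    # clouds are close to white, so cloudy pixels have a higher red/blue ratio.
    img = cv2.imread(sky_image_path).astype(np.float32)
    blue, _, red = cv2.split(img)          # OpenCV loads images as BGR
    ratio = red / (blue + 1e-6)            # avoid division by zero
    cloud_mask = ratio > threshold         # threshold needs per-camera calibration
    return float(cloud_mask.mean())        # fraction of pixels classified as cloud

# The resulting fraction could then feed a short-term generation forecast or
# the battery / spinning-reserve dispatch logic.
print('cloud cover: %.0f%%' % (100 * cloud_fraction('sky.jpg')))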
Yes I mean, a carrier board for the Jetson Module. This is basically what we are talking about here.
There is no consensus on the architecture for the moment, which is why yesterday I told Jurgens to consider splitting the development into two versions:
A) STD: HDMI-USB
B) Multicamera: CSI2
Well, the way you would architect it would be with the Jetson module. Chip-level designs outside of NVIDIA aren't supported with the TX1 or in the future. Having everyone do their own TK1 designs (myself included) was problematic with the memory layout, numerous software configurations, etc. It will be better to adopt the Jetson module pin-out, which already closely mirrors the capabilities of the Tegra chip (you can do the 6 CSI cameras, for example), and for which NVIDIA will provide new backwards-compatible modules as new Tegra chips arrive. So once you design the carrier once, it should be as simple as a plug-n-play upgrade with each new Tegra, as opposed to a time-consuming & expensive BGA design each generation. I used to be in the chip-level camp myself, but I'm liking the standardized Jetson module and its associated roadmap.