Compiling ArduPilot for Pixhawk

Hey, 

I am diving into the ArduCopter code for my project, and I am curious how the ArduCopter repository is compiled for specific hardware (whether it's APM or Pixhawk). When I look at the ArduCopter repository, I come across a lot of .pde files (Arduino-specific) which seem to belong to ArduPilotMega (an Arduino-based autopilot).

I wonder how those .pde files compile into a working executable for the Pixhawk, or whether I am missing some files.

If you can point me to a tutorial or some documentation, it would be really appreciated.

Replies

  • Hi Robin, the same codebase is used for all platforms; the platform differences are handled in the HAL (hardware abstraction layer).

    I had no problems following the instructions at the links posted by Xian06 the first time I compiled the code for the Pixhawk. If you are running into problems at a specific step, let us know.

    • I have managed to compile the ArduPilot code into .px4 executables (ready to upload).

      What I did not understand is how the .pde files get converted into one huge .cpp file (around 15k lines).

      My assumption is that the makefile condenses them into a single .cpp file, which is then compiled into a Pixhawk executable.

      Are my thoughts on the right track?

