So these "real turtles" got me thinking....
#1
Posted 25 January 2013 - 02:08 PM
#2
Posted 25 January 2013 - 05:02 PM
Wired2coffee, on 25 January 2013 - 02:08 PM, said:
#3
Posted 25 January 2013 - 05:04 PM
The best you can get at the moment is just the Lua interpreter.
There is no Sun/Oracle Java for armhf yet, and neither Minecraft nor most of the Java programs that go with it will run under OpenJDK or Java SE.
Not even the computercraft emulator.
I have managed to get one dependency library to compile under OpenJDK, namely LWJGL, but it took literally four hours of mucking with compiler flags and chasing down compile errors.
You can read my research results in this thread.
#4
Posted 25 January 2013 - 05:39 PM
dissy, on 25 January 2013 - 05:04 PM, said:
#5
Posted 25 January 2013 - 05:58 PM
Wired2coffee, on 25 January 2013 - 05:39 PM, said:
Yes that is very possible.
Lua is designed as an embeddable language. The Lua site itself mainly talks about how to do this with C/C++, but there is a Python package called Lupa that lets you embed a Lua interpreter in Python. Another is Lunatic Python.
The idea is that from Python you declare functions Lua can call, and Python can read and control the Lua variables.
Basically you would implement all the CC specific commands in Python.
So, for one example, you would create a Python function and bind it to the Lua command "turtle.forward".
In this case it's easy: just have the Python function return success. Now when the Lua program calls "turtle.forward()", instead of an error that the command doesn't exist, it will get "true" back and think the turtle moved.
Ideally the Python program will have some x/y/z variables and actually track where it thinks the turtle is. So in the Python call, have it increment that x variable.
Later you can bind a Python function to "gps.locate" and have it return where it thinks it is.
It would be a lot of work to implement all the functions needed to keep a Lua program happy, but it's more just tedious than hard to do.
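To make that concrete, here is a minimal pure-Python sketch of the state-tracking side. The class name `TurtleState` and its layout are my own invention for illustration, not part of any real emulator, and the `locate` method is a stand-in for the GPS lookup described above:

```python
class TurtleState:
    """Tracks where the emulated turtle thinks it is."""
    # Headings: 0 = +x, 1 = +z, 2 = -x, 3 = -z
    _DX = (1, 0, -1, 0)
    _DZ = (0, 1, 0, -1)

    def __init__(self):
        self.x = self.y = self.z = 0
        self.heading = 0

    def forward(self):
        # A real turtle could fail (blocked, out of fuel);
        # this fake one always reports success.
        self.x += self._DX[self.heading]
        self.z += self._DZ[self.heading]
        return True

    def turnRight(self):
        self.heading = (self.heading + 1) % 4
        return True

    def locate(self):
        # Stand-in for the GPS call: report the tracked position.
        return (self.x, self.y, self.z)
```

With Lupa, the binding would then look something like `lua.globals().turtle = TurtleState()` (untested here), after which Lua code calling `turtle.forward()` gets true back exactly as described above.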
I've done similar things using Tcl and C before. Not only do you get a config file format that calls commands to set things up and has access to data in variables, but you get all the scripting abilities of Tcl right in your config file! In that sense, Tcl and Lua are very similar.
As a bonus, you'll get to add "Developed software language emulator" to your resume ;}
#6
Posted 25 January 2013 - 06:01 PM
Edit: wow, this post was a reply to Wired2Coffee's. I'm going to read dissy's huge wall of text now.
#7
Posted 25 January 2013 - 06:09 PM
Other than that, shouldn't be a problem
#8
Posted 25 January 2013 - 06:13 PM
#9
Posted 25 January 2013 - 06:20 PM
It's all about embedding Lua in other languages.
The top is mainly about C, but if you scroll down to "Other Languages", there are plenty to choose from. All I can suggest is to use whatever host language you are most comfortable and familiar with.
Orwell: What's funny/sad is, I was thinking the exact same thing.
Ever see those boards with roads/tracks drawn on the top, and magnets on motor controlled pulleys underneath, where you place a little car with a magnet in it on the board and it gets moved about?
I have two spare Raspberry Pis, a couple of IO expanders, and some dead flatbed scanners for parts that I was seriously considering wiring together ;}
Then all that's left is giving it a web interface for control, pointing a webcam at it, and putting it all up on the Internet for heavy abuse
#10
Posted 25 January 2013 - 06:25 PM
NeverCast, on 25 January 2013 - 06:09 PM, said:
Other than that, shouldn't be a problem
The ARM1176JZF-S chip in the Pi is actually bi-endian. It supports both big- and little-endian modes internally.
The ARM's CPU instructions are fixed as little-endian, but as for data it's just a matter of either declaring a block of memory to be big-endian, or just setting a flag before doing the copy and it will auto-convert.
These are the specific low-level ARM docs that go over the details.
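Whatever mode the CPU runs in, the host language can also handle byte order explicitly when moving data around. A small illustration using Python's struct module, just to show what the swap amounts to (nothing ARM-specific here):

```python
import struct

value = 0x12345678

# Pack the same 32-bit unsigned integer in both byte orders.
little = struct.pack("<I", value)  # bytes 78 56 34 12
big = struct.pack(">I", value)     # bytes 12 34 56 78

# Converting is just unpacking in one order and repacking in the other.
swapped = struct.pack(">I", struct.unpack("<I", little)[0])
```

The same trick (format prefixes `<` and `>`) covers any fixed-width field you need to exchange between machines of different endianness.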
#11
Posted 25 January 2013 - 06:27 PM
I happen to have spent some time the last two days researching the use of the Raspberry Pi in robotics. I plan on experimenting with stereo catadioptrics to navigate a robot. I'm also planning on proposing it as the subject for my bachelor's thesis, because it will demand a terrible amount of time.
#12
Posted 25 January 2013 - 06:28 PM
dissy, on 25 January 2013 - 06:25 PM, said:
NeverCast, on 25 January 2013 - 06:09 PM, said:
Other than that, shouldn't be a problem
The ARM1176JZF-S chip in the Pi is actually bi-endian. It supports both big- and little-endian modes internally.
The ARM's CPU instructions are fixed as little-endian, but as for data it's just a matter of either declaring a block of memory to be big-endian, or just setting a flag before doing the copy and it will auto-convert.
These are the specific low-level ARM docs that go over the details.
#13
Posted 25 January 2013 - 06:29 PM
#14
Posted 25 January 2013 - 06:31 PM
Cranium, on 25 January 2013 - 06:29 PM, said:
#15
Posted 25 January 2013 - 06:36 PM
One of my four Pis is earmarked for a robotics platform of my own. I have a 3D-printed mount to affix it to my AR drone. Using a wifi dongle it can join the drone's wifi network and send it navigation commands. I've also got a USB GPS working in Debian to plot courses along waypoints I can drop on a map and upload to it.
With the Open Computer Vision library (OpenCV), I'm hoping I can get it to parse the two cameras on the drone, recognize faces and movement, and then react to them by navigating.
Ever since I was a small child I've wanted a robot I could tell "get the kitty!" and it would obey >:}
#16
Posted 25 January 2013 - 06:42 PM
Edit: You were talking about depth calculation using the two cameras, right?
Edit2: I've also given up on believing that the Raspberry Pi could do the image processing itself. I've read of people getting 1.5 frames per second at a resolution of 320x200 doing simple Canny edge detection...
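Those numbers are believable: even the simplest gradient-based edge detector touches every pixel's whole neighbourhood. As a rough illustration of the per-pixel work involved, here is a plain Sobel gradient in pure Python (not OpenCV's Canny, and much slower than either, but the arithmetic per pixel is representative):

```python
def sobel_magnitude(img):
    """Approximate gradient magnitude of a grayscale image given as a
    list of rows of ints; edges show up as large output values."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal and vertical Sobel kernels over the 3x3 neighbourhood.
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            out[y][x] = abs(gx) + abs(gy)
    return out
```

At 320x200 that inner body runs roughly 64,000 times per frame, which is why people offload this kind of work to OpenCV's compiled C code, or off the Pi entirely.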
#17
Posted 25 January 2013 - 06:54 PM
Orwell, on 25 January 2013 - 06:42 PM, said:
I haven't actually attempted it myself yet. But I know a lot of people use USB cameras when they should know the Pi's USB bus is very underpowered.
There is a CSI camera connector on the Pi that interfaces directly with the SoC, and Broadcom claims it can do HD video at 28fps, or 320x280 at almost 60.
Of course, they make the thing, so they'd possibly say that either way. But yes, over USB you won't get much bandwidth, especially shared with the network (the Ethernet is on the USB bus internally, as would be any wifi dongle).
It took me almost 5 hours to compile OpenCV from source on the Pi, though. That was back before I had my cross-compiler set up on my i7; I should try it again using distcc just to compare compile times, heh.
Unfortunately the AR drone's ARM is too busy running real-time processes for flight stabilization, on top of streaming the video signals and relaying commands. If I can't offload it, it won't get done :{
#18
Posted 25 January 2013 - 06:57 PM
Edit: My bookmarks on this:
http://eduardofv.com...he-Raspberry-Pi
http://www.fanjita.o...Arch-Linux.html
#19
Posted 25 January 2013 - 11:24 PM
#20
Posted 26 January 2013 - 01:03 AM
Alekso56, on 25 January 2013 - 11:24 PM, said: