In our last post, we combined ZeroMQ and Google Protocol Buffers (protobuf), a pairing that enables heterogeneous distributed computing systems. That post integrated the two in C++; this one focuses on integrating them in Python. By the end, we'll have an example that can communicate seamlessly with the previous C++ example.
Below is a Makefile used to generate, and clean up, the Python protobuf message module (note that the recipe lines must be indented with a tab):
$ cat Makefile
msgs:
	${SH} protoc -I=. --python_out=. Messages.proto

clean:
	${RM} Messages_pb2.py
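The post doesn't list Messages.proto itself, but based on the fields the scripts below use (id and name), it would look something like the following proto2 definition. The field numbers and required qualifiers here are assumptions rather than a copy of the original file:
$ cat Messages.proto
message Person {
  required string name = 1;  // display name set by the sender
  required int32 id = 2;     // numeric identifier set by the sender loop
}
Running "make msgs" then produces Messages_pb2.py, which both scripts import.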
Our sender script is below. Notice that the only real change from the previous post is the content exchanged: the payload is now a serialized protobuf message.
$ cat sender
#!/usr/bin/python
import zmq
import time
import Messages_pb2

context = zmq.Context()
pub = context.socket(zmq.PUB)
pub.bind("tcp://127.0.0.1:8000")

for i in range(0, 10):
    # build and populate the protobuf message
    p = Messages_pb2.Person()
    p.id = i
    p.name = "fatslowkid"
    print "iteration", i
    # publish the serialized message bytes
    pub.send(p.SerializeToString())
    time.sleep(1)
The receiver:
$ cat receiver
#!/usr/bin/python
import zmq
import Messages_pb2

context = zmq.Context()
sub = context.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:8000")

# an empty filter subscribes to every message the publisher sends
filter = ""
sub.setsockopt(zmq.SUBSCRIBE, filter)

for i in range(0, 20):
    print "waiting on msg"
    M = sub.recv()
    # deserialize the received bytes back into a Person
    p = Messages_pb2.Person()
    p.ParseFromString(M)
    print "received", p
    print "> " + p.name
    print p.id
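As an aside, the empty filter above means this subscriber receives everything the publisher sends. ZeroMQ's subscription filter is a prefix match on the leading bytes of a message (the first frame, for multipart messages), so if you want selective delivery a common pattern is to send a topic frame ahead of the protobuf payload. The sketch below is a self-contained variation, not part of the original example; the port 8001 and the topic "person" are made up for illustration:
import zmq
import time
import Messages_pb2

context = zmq.Context()

pub = context.socket(zmq.PUB)
pub.bind("tcp://127.0.0.1:8001")

sub = context.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:8001")
# deliver only messages whose first frame starts with "person"
sub.setsockopt(zmq.SUBSCRIBE, "person")

time.sleep(0.5)  # give the subscription time to propagate to the publisher

p = Messages_pb2.Person()
p.id = 1
p.name = "fatslowkid"

# the topic travels as the first frame, the serialized Person as the second;
# SUBSCRIBE filtering is applied against the first frame only
pub.send_multipart(["person", p.SerializeToString()])

topic, payload = sub.recv_multipart()
q = Messages_pb2.Person()
q.ParseFromString(payload)
print "got", topic, q.name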
The sender and receiver can be used together, or paired with the sender/receiver from the previous C++ example.
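For example, assuming the scripts have been made executable, start the receiver in one terminal:
$ chmod +x sender receiver
$ ./receiver
and the sender in another:
$ ./sender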
Cheers.