Hello!

I've got a problem reading the content of web pages using QNetworkAccessManager and QNetworkReply. Here is my code:

Qt Code:
void MainWindow::getXML()
{
    qDebug() << "Getting content...";   // qDebug() appends a newline itself, no endl needed

    QNetworkRequest request(QUrl("http://www.google.pl"));
    qDebug() << "Network request...";

    NetRepl = NetAccMan.get(request);
    qDebug() << "Network reply...";

    connect(NetRepl, SIGNAL(readyRead()), this, SLOT(parseXML()));
    qDebug() << "Connect...";
}

void MainWindow::parseXML()
{
    qDebug() << "Ready to parse";

    // readyRead() can be emitted several times for one reply, and
    // read(2048) returns at most 2048 bytes, so this slot may run
    // more than once per page.
    QByteArray newData = NetRepl->read(2048);

    qDebug() << newData;
    [...]
}
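As an aside on the slot above: since readyRead() may fire repeatedly, a common pattern is to append each chunk to one buffer and parse only once the finished() signal arrives. A minimal sketch of the accumulation, with std::string standing in for QByteArray (the Qt equivalent would be buffer.append(NetRepl->readAll())):

```cpp
#include <string>
#include <vector>

// Append each arriving chunk to a single buffer; parse only after the
// whole transfer is done. Each loop iteration stands in for one
// readyRead() emission.
std::string accumulate(const std::vector<std::string> &chunks)
{
    std::string buffer;
    for (const std::string &c : chunks)
        buffer += c;                      // buffer.append(reply->readAll()) in Qt
    return buffer;
}
```

This way nothing is lost if the reply arrives in more than one piece.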

It reads the content of almost every page, but there are some pages whose content I can't get (parseXML() is never called). For example, my WWW server (a simple HTTP server based on a small AVR controller) generates simple pages with the data I want to parse:

Qt Code:
<RESP>
    <FUNC>some_func</FUNC>
    <VAL>true</VAL>
    <DATA>
        <D1>1</D1>
        <D2>2</D2>
    </DATA>
</RESP>

Only this and nothing more. Do I need some headers? How does this work? I'm a little confused.
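For what it's worth, QNetworkAccessManager expects a well-formed HTTP response, not just a raw body: a status line, headers, and a blank line before the payload. If the AVR server sends only the XML, the reply may never become readable. A sketch of the minimal response the server could emit (the exact header set is an assumption; the status line, Content-Length, and the blank separator line are the essential parts):

```cpp
#include <string>

// Wrap a body in a minimal HTTP/1.0 response. Header values here are
// illustrative; Content-Length must match the body size exactly.
std::string httpResponse(const std::string &body)
{
    return "HTTP/1.0 200 OK\r\n"
           "Content-Type: text/xml\r\n"
           "Content-Length: " + std::to_string(body.size()) + "\r\n"
           "\r\n"
           + body;
}
```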

thanks in advance
best regards
Tomasz