c# - Should EndReceive ever return zero if the socket is still connected?


I'm calling BeginReceive on a socket as follows:

  m_socket.BeginReceive(m_buffer, 0, m_buffer.Length, SocketFlags.Partial, this.ReceiveCallback, null);

Then, in my class's ReceiveCallback function, I call:

  try
  {
      int bytesReadFromSocket = m_socket.EndReceive(ar);
      if (bytesReadFromSocket > 0)
      {
          // some processing of the data
      }
  }
  finally
  {
      if (m_socket.Connected)
          m_socket.BeginReceive(m_buffer, 0, m_buffer.Length, SocketFlags.Partial, this.ReceiveCallback, null);
  }

The issue I'm running into is that EndReceive is returning zero, but m_socket.Connected is still coming back true, so BeginReceive gets called again. This happens in a tight loop that never stops. It isn't clear from the documentation when EndReceive returns zero; I assumed it would only happen when the socket is closed, but Connected still seems to be true.

So the question remains: under what conditions can EndReceive be expected to return zero?

Answer

Socket.EndReceive() returns 0 in one specific case: the remote host has begun or completed a graceful close sequence, i.e. it has shut down its send side (for a .NET program on the other end, by calling Socket.Shutdown() with SocketShutdown.Send or SocketShutdown.Both).

However, note that technically, until your end of the socket is closed as well, it is still "connected".
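To make those semantics concrete, here is a minimal, self-contained sketch over a loopback socket pair (the class and method names are mine, not from the question). After the peer calls Shutdown(SocketShutdown.Send), a read on the other side returns 0, even though the reading socket has not itself been closed:

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

class GracefulShutdownDemo
{
    // Returns { totalBytesReceived, lastReceiveReturnValue }.
    public static int[] Run()
    {
        // Listener on an ephemeral loopback port.
        var listener = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
        listener.Bind(new IPEndPoint(IPAddress.Loopback, 0));
        listener.Listen(1);

        var client = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
        client.Connect(listener.LocalEndPoint);
        Socket server = listener.Accept();

        // Peer sends some data, then gracefully shuts down its send side.
        server.Send(Encoding.ASCII.GetBytes("hello"));
        server.Shutdown(SocketShutdown.Send);

        var buffer = new byte[1024];
        int total = 0, n;
        // Receive returns 0 once the peer's graceful close is seen.
        while ((n = client.Receive(buffer)) > 0)
            total += n;

        client.Close();
        server.Close();
        listener.Close();
        return new[] { total, n };
    }

    static void Main()
    {
        int[] result = Run();
        Console.WriteLine("bytes received: " + result[0]);
        Console.WriteLine("last Receive returned: " + result[1]);
    }
}
```

The synchronous Receive is used here only to keep the sketch short; the 0-return rule is identical for EndReceive.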

You should not use the Connected property to decide whether to issue another read on the socket. Instead, since a return value of 0 is specifically reserved to indicate that no more data will be sent, you should call EndReceive() and then call BeginReceive() again only if the returned value is a positive (non-zero) number.
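Putting that together, here is a sketch of a corrected callback, reusing the field names from the question (the Receiver/Finished wrapper and the error handling are my additions, and I use SocketFlags.None rather than the Partial flag from the question). The read is re-armed only on a positive byte count, and 0 is treated as end-of-stream:

```csharp
using System;
using System.Net.Sockets;
using System.Threading;

class Receiver
{
    private readonly Socket m_socket;
    private readonly byte[] m_buffer = new byte[4096];

    // Signals when the peer has finished sending (EndReceive returned 0).
    public readonly ManualResetEvent Finished = new ManualResetEvent(false);

    public Receiver(Socket connectedSocket)
    {
        m_socket = connectedSocket;
        m_socket.BeginReceive(m_buffer, 0, m_buffer.Length,
                              SocketFlags.None, this.ReceiveCallback, null);
    }

    private void ReceiveCallback(IAsyncResult ar)
    {
        int bytesReadFromSocket;
        try
        {
            bytesReadFromSocket = m_socket.EndReceive(ar);
        }
        catch (SocketException)
        {
            Finished.Set();   // hard error: abandon the read loop
            return;
        }

        if (bytesReadFromSocket > 0)
        {
            // ... process m_buffer[0 .. bytesReadFromSocket) ...

            // Re-arm the read ONLY on a positive count; do not consult Connected.
            m_socket.BeginReceive(m_buffer, 0, m_buffer.Length,
                                  SocketFlags.None, this.ReceiveCallback, null);
        }
        else
        {
            // 0 is reserved: the remote side will send no more data.
            Finished.Set();
        }
    }
}
```

With this structure the tight loop from the question cannot occur: once EndReceive returns 0, no further BeginReceive is issued, regardless of what Connected reports.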

