Hilarious Rage Ban
Started by Luffy-4-, Mar 26 2012 06:40 PM
17 replies to this topic
#1
Posted 26 March 2012 - 06:40 PM
I am not here to whine, I just came to share my hilarious experience with my dear friend Sculder! Well, after I killed him a couple of times he started complaining about my reg and how he couldn't even see me due to my so-called "lag", even though no one else had previously complained about it (weird, huh?), and he decided to ban me! And I ask myself, isn't the ping limiter meant to do that job, or was I "misled"? Anyway, Sculder my friend, I suggest you start a stand-up comedy show, you are way too funny for Setti.
#3
Posted 26 March 2012 - 07:00 PM
Hmmm! So you are saying that my ping is 137 and the ping limiter is at 120, but I don't get kicked. Oh, I understand, it's perfectly logical! And if you are kind enough to answer me: how is it possible that when I press "TAB" my ping shows around 95-105? Oh, I get it, either I am blind or 100 is the new 137 in the 21st century! That's the spirit!
#9
Posted 26 March 2012 - 10:48 PM
That's what I am trying to say: if he was right, I would be kicked by the limiter! So either the limiter is broken, although I doubt it because I have been kicked plenty of times while using torrents and such, or I don't have the required ping to get kicked, so my ping must be under 120! That's what I am trying to say: either fix the limiter so I can't join because I "lag" so much, or leave me in peace, because K1ller (or whoever) made the limiter like that so I and some other players with slightly higher ping rates can play peacefully! Is that so hard to understand?
#12
Posted 27 March 2012 - 12:36 PM
I don't know exactly how that script works, but my guess is that it counts an average over a period of time. You lag, then after your death your ping drops, and the average comes out OK. When I had net problems in the evenings a year ago, 100 latency, players complained about me. Breezer had the same problem because we're on the same ISP. Instead of complaining here, you could have made a call to your internet provider.
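If that guess is right, a kicker that averages over a window would behave exactly as described: short spikes get smoothed away, and a ping that drops after death pulls the average back under the limit. A minimal sketch of the idea (the window size, threshold and all names here are my assumptions, not the actual plugin code):

```cpp
#include <cstddef>
#include <deque>
#include <numeric>

// Hypothetical averaging ping limiter: a player is only kicked when the
// *average* over the last `window` samples exceeds the limit, so brief
// spikes above the limit pass without a kick.
struct PingLimiter {
    std::deque<int> samples;  // recent ping samples in ms
    std::size_t window = 30;  // number of samples to average over (assumed)
    int limit = 120;          // kick threshold in ms

    void addSample(int pingMs) {
        samples.push_back(pingMs);
        if (samples.size() > window)
            samples.pop_front();  // drop the oldest sample
    }

    bool shouldKick() const {
        if (samples.empty())
            return false;
        int sum = std::accumulate(samples.begin(), samples.end(), 0);
        return sum / static_cast<int>(samples.size()) > limit;
    }
};
```

With this scheme a player spiking to 137 ms for a while, then settling around 100 ms, averages out below the 120 ms limit and never gets kicked, which would explain the behaviour in the first post.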
Just to make it clear: admins > system plugins.
When we had KAC installed and cheaters were running freely, we were still banning them manually.
@GR Gavno
Instead of shouting "admins fix lag", give some suggestions as to how.
EDIT:
@Sponsi
latency <> ping
#13
Posted 27 March 2012 - 01:20 PM
As far as I know there is no easy way to see your real ping in Source games (vanilla).
But there is an option for mod developers (using stuff from the SDK) to get a player's true ping, and I bet $10 that the Mani coders are smart and use the true values for their ping kicker, not the faked values from the scoreboard or net graph. Don't ask me why Gaben fakes the ping, because I don't know.
At the moment I'm searching through the SDK to show and explain to you how Gaben fakes the scoreboard and net graph ping.
#14
Posted 27 March 2012 - 02:27 PM
// GetFrameData is a CNetGraphPanel class method, used to calculate the values
// that are displayed in the net graph panel. Lerp, fps... stuff I sliced off.
void CNetGraphPanel::GetFrameData( INetChannelInfo *netchannel, int *biggest_message, float *avg_message, float *f95thpercentile )
{
    // The client's average outgoing latency, retrieved through the netchannel
    // interface, is stored in seconds in the member variable m_AvgLatency.
    // A ping of 100 ms = 0.1.
    m_AvgLatency = netchannel->GetAvgLatency( FLOW_OUTGOING );

    float flAdjust = 0.0f;

    // If the client's cl_updaterate cvar is bigger than 0.001, do the math below.
    if ( cl_updaterate->GetFloat() > 0.001f )
    {
        flAdjust = -0.5f / cl_updaterate->GetFloat(); // e.g. -0.5 / 66 = -0.0075757575757576
        m_AvgLatency += flAdjust;                     // added to the real ping value
    }

    // The ping value is clamped so it can't go below zero.
    m_AvgLatency = max( 0.0, m_AvgLatency );
}
Example for a 100 ms ping:
0.1 + ( -0.5 / 66 ) = 0.0924242424242424
To get the ping in milliseconds, multiply that number by 1000 (this is done in CNetGraphPanel's DrawTextFields method).
So the ping you see in the net graph is about 7.5 ms lower than the real ping.
void CNetGraphPanel::DrawTextFields( int graphvalue, int x, int y, int w, netbandwidthgraph_t *graph, cmdinfo_t *cmdinfo )
{
    Q_snprintf( sz, sizeof( sz ), "fps:%4i ping: %i ms",
                (int)( 1.0f / m_Framerate ),
                (int)( m_AvgLatency * 1000.0f ) );
}
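Putting GetFrameData and DrawTextFields together, the displayed value for a given real ping can be reproduced in a few lines. This is a standalone sketch, not SDK code; the function name and signature are mine:

```cpp
#include <algorithm>

// Reproduces the net graph adjustment described above: the engine subtracts
// half an update interval from the measured latency before displaying it.
// Latency is kept in seconds internally (100 ms = 0.1).
float displayedPingMs(float realPingSeconds, float updaterate) {
    float avgLatency = realPingSeconds;
    if (updaterate > 0.001f)
        avgLatency += -0.5f / updaterate;     // e.g. -0.5 / 66 = -0.00757...
    avgLatency = std::max(0.0f, avgLatency);  // clamp at zero
    return avgLatency * 1000.0f;              // DrawTextFields multiplies by 1000
}
```

For a real ping of 100 ms and cl_updaterate 66 this gives roughly 92.4 ms, matching the worked example above.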
The scoreboard ping display is faked even more, by at least 20-25 ms in my case. Another factor here is the scoreboard panel's once-per-second update rate.
I'm too lazy to search through and analyze the scoreboard code, but if you compare the net graph and scoreboard values you can see the difference from an airplane.
#15
Posted 27 March 2012 - 02:58 PM
The facts, based on my observations:
I have an ADSL connection, 4M / 512k.
My ping to the ISP server and back is between 15 ms and 18 ms, average 17 ms.
My ping to the Setti server and back (cmd) is between 50 ms and 55 ms, average 53 ms.
My ping connected to the Setti server with 10 bots playing is around 60-62 ms; with the server full, around 75-80 ms.
So my conclusion is that each bot on the server raises my ping by about 1 ms.
#16
Posted 27 March 2012 - 03:29 PM
Nice reading from wav the moon master. A lot of questions are answered here.
Understanding GoldSrc/Source Network stuff
Since a lot of people seem quite confused about what interpolation, prediction, lag compensation, etc. are.
First off, what is interpolation? The server sends out updates to each client, and the client stores the updates in a buffer. These updates form the basis of how interpolation works. The interpolation period is decided by your cl_interp value and is also dependent on your cl_updaterate and cl_interp_ratio. Interpolation works by taking the oldest entry in the buffer, data that is in the past ( decided by cl_interp, cl_interp_ratio and cl_updaterate ), and smoothly animating towards the latest update received. That's why animations are jerky when you disable interpolation: the client is trying to animate using the freshest data all the time rather than having a buffer of updates to interpolate between. For instance, if we have an updaterate of 100 and fps_max of 300, we have to render 3 frames using 1 update. Or with an updaterate of 100 and fps_max of 100, 1 update per frame of animation.
The default value is 100 milliseconds of interpolation, that is, we're using data from 100 milliseconds in the past up to the current update. This means that your view is 100 milliseconds in the past plus whatever your latency is. So for example if you have a 150 millisecond ping, your view with cl_interp set to 0.1 is 250 milliseconds in the past. If you set cl_interp lower than cl_interp_ratio / cl_updaterate, cl_interp is clamped to cl_interp_ratio / cl_updaterate. I won't bother discussing cl_updaterate except to say that it controls the number of updates per second you are supposed to receive from the server under ideal conditions, and that it should ideally be set to the server tickrate. Remember that interpolation protects against dropped packets due to the way it works; see Extrapolation for a more detailed explanation.
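The clamp described above can be sketched in one line; the cvar names are real, the function itself is just illustrative:

```cpp
#include <algorithm>

// Effective interpolation period: cl_interp, but never less than
// cl_interp_ratio / cl_updaterate (the clamp described in the post).
float effectiveLerpSeconds(float cl_interp, float cl_interp_ratio, float cl_updaterate) {
    return std::max(cl_interp, cl_interp_ratio / cl_updaterate);
}
```

So with cl_interp 0.01, cl_interp_ratio 2 and cl_updaterate 66, the effective period is 2 / 66 ≈ 0.0303 s, not the 0.01 s you asked for.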
Extrapolation. I really can't say too much about this because it's so intuitive. Extrapolation is used when we have nothing left to interpolate with: we try to predict where an entity will be and what it will look like in the future. For instance, we take the player's last received update, their velocity and their origin, and try to predict where their new origin will be. Keep in mind extrapolation is extremely inaccurate and only a last resort if interpolation totally fails. Example: we're dropping packets left and right. Normally interpolation is able to protect us against dropped packets by buffering updates, but if we lose too many, we have to extrapolate ( try to predict the future, cue wonky music here ) with the last update(s) we got. This is yet ANOTHER reason WHY YOU SHOULD NEVER DISABLE INTERPOLATION, PERIOD. I can't stress that ENOUGH.
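The naive projection that extrapolation boils down to can be sketched as follows; the types and names are mine, not the engine's:

```cpp
// Simple 3D vector for the sketch.
struct Vec3 { float x, y, z; };

// Naive extrapolation as described above: project the last known origin
// forward along the last known velocity. Only plausible for short gaps and
// straight-line movement, which is why it is a last resort.
Vec3 extrapolateOrigin(Vec3 origin, Vec3 velocity, float secondsAhead) {
    return { origin.x + velocity.x * secondsAhead,
             origin.y + velocity.y * secondsAhead,
             origin.z + velocity.z * secondsAhead };
}
```

A player moving at 100 units/s who dodges, stops or turns during the gap ends up nowhere near the extrapolated position, which is exactly the inaccuracy the post warns about.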
As far as prediction is concerned, cl_predict is the cvar here, and I don't have much to say about it. Prediction simply means that if you have a very high ping you don't wait for the server to tell you what changed with regard to your local player ( movement, firing your weapon, etc ). For instance, if you had 1000 msec of ping, or 1 second, and you pressed the fire button, it would fire instantly ( on your client ). Whereas if you disabled cl_predict you would have to wait a full 2 seconds for the server to tell your client you fired your weapon and animate it. Or if you moved, etc. Pretty obvious. This hearkens back to the Quake days, when dialup was the predominant means of accessing the internet. John Carmack wanted to increase the responsiveness of the game ( for the local player ) to reduce perceived latency, so local player prediction using shared code made perfect sense. Of course, the server is still authoritative with regard to prediction, so if the client's prediction is too far off the server will set it right.
Now we come to lag compensation, one of the most important features, and definitely a love/hate relationship for some players. What is it? Lag compensation uses player data to remove the effect of latency when using hitscan weapons. Remember back in the Quake days, when you had to lead your target by your ping? This is what it's designed to fix. Lag compensation stores each player's usercmds in a buffer, along with their latency and interpolation period. When a fire command is received and validated, lag compensation moves the server "back in time" to the moment when the fire event occurred for that player ( firetime = current server time - packet travel time - player latency - interpolation period ). This means that player origins, angles and such are moved backwards to that moment, and the weapon's logic code is run. This can have some interesting effects for low latency players. For instance, if a player with high latency attacks a low latency player, then due to local prediction on the low latency player's computer he was out of the way of the shot, but to the lagged player ( due to interpolation and his ping ) the low latency player was in the path of the bullet. The server doesn't care and issues damage to the low latency player, and he dies. There is another interesting piece: if a low latency player ( for the sake of simplicity, 15 milliseconds ) and a high latency player ( 150 milliseconds ) both press their fire button, and both shots would kill either player, the lower latency player won't die. This is because the server processes his commands first, since they were received before the high latency player's.
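The rewind target from the formula in that paragraph is just a subtraction; here it is as a standalone sketch (the names are illustrative, not SDK identifiers):

```cpp
// firetime = current server time - packet travel time - player latency
//            - interpolation period   (all values in seconds)
// This is the moment lag compensation rewinds player positions to before
// running the weapon's hit logic.
float lagCompFireTime(float serverTime, float packetTravelTime,
                      float playerLatency, float lerpPeriod) {
    return serverTime - packetTravelTime - playerLatency - lerpPeriod;
}
```

For a shooter with 50 ms latency, 20 ms packet travel time and the default 100 ms lerp, the server rewinds to 170 ms before the current server time, which is why a low-latency target can be hit "around a corner" from his own point of view.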
Lag compensation is the main reason why ping prediction is unnecessary. However if you do care to use ping prediction to try and gain an advantage you will need to calculate the player's velocity and multiply it by your ping and add it to the aim position. Remember you will also have to disable interpolation or account for it.
Remember that your ping fluctuates, and depending on the tickrate and what part of the frame the server is in, you may pay the added penalty of waiting another tick or so before the server begins processing your commands. This can affect your TOTAL latency.
Total latency = lerp + latency
Total latency += ( 1 / tick_rate ); // IF the server did not receive your command at the start of the tick
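The two lines above can be folded into one small function; the signature is mine, the formula is the post's:

```cpp
// Total perceived latency: interpolation period plus network latency,
// plus up to one tick of server-side wait if the command arrived after
// the start of the current tick.
float totalLatencySeconds(float lerp, float latency,
                          float tickRate, bool missedTickStart) {
    float total = lerp + latency;
    if (missedTickStart)
        total += 1.0f / tickRate;
    return total;
}
```

For example, with 100 ms lerp, 50 ms latency and a 66-tick server, a command that misses the tick start costs about 165 ms total rather than 150 ms.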