Well, first off, you have conflicting values of cl_interp and cl_interp_ratio. These two cvars together determine the client interp value, which is a single decimal value measured in seconds.
cl_interp_ratio sets the client interp value to cl_interp_ratio divided by cl_updaterate, while cl_interp sets it directly. The client uses the higher of the two results as the final interp value.
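To make that concrete, here's that rule as a quick sketch in plain Python (my own illustration with my own function name, not actual engine code):

    # Final client interp value, per the rule described above.
    def client_interp(cl_interp, cl_interp_ratio, cl_updaterate):
        return max(cl_interp, cl_interp_ratio / cl_updaterate)

    client_interp(0.033, 2, 66)  # -> 0.033, the direct value wins
    client_interp(0.0, 2, 66)    # -> 0.0303..., 2/66 wins

So when the two cvars conflict, whichever one produces the lower value silently does nothing.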
You also seem to have flipped the cl_interp_ratio values that are conventionally used for projectile and hitscan classes.
Now, let's talk about what interp actually is. Interpolation is an actual word, not just a Source Engine term. You may have also seen lerp in the net graph. This is simply short for linear interpolation, which is something you probably did in school when you calculated a line equation from the rise over run of two points. It's a way of determining a linear, continuous path from one discrete point to the next.
Now, in the real data that you recorded those points from, the actual continuous path might have been a zig zag, might have curved down, anything really. There's a large range of realistic possibilities, and how large that range is depends on the distance between the two points along some axis; let's call this distance delta x. As delta x gets smaller and smaller, you approach the point where the gap between the two samples is practically zero (what we call dx), and the path between them could only really be a straight line.
For any delta x larger than that, there is a possible path error, and that possible error gets larger and larger as delta x increases. So, when we do linear interpolation, we want to keep delta x small; that gets us the best result.
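In code, linear interpolation between two samples is just this (a generic sketch, nothing TF2-specific):

    # Estimate a value at x, given two recorded samples (x0, y0) and (x1, y1).
    def lerp(x0, y0, x1, y1, x):
        t = (x - x0) / (x1 - x0)   # fraction of the way from x0 to x1
        return y0 + t * (y1 - y0)  # straight-line estimate

The smaller the gap x1 - x0 is, the less room the real path has to deviate from that straight line.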
Now, how does this stupid math stuff relate to Team Fortress 2? Well, the server runs its game logic every 0.015 seconds, and the results of running that logic at those discrete steps get sent to your PC over the Internet cl_updaterate times a second. So you're (theoretically) getting discrete points of data at a delta x of about 0.0151515 seconds if you have cl_updaterate set to 66.
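For reference, the arithmetic:

    server tick interval: 0.015 s
    update interval: 1 / 66 ≈ 0.0151515 s (your delta x)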
Without interpolation, the result would be players teleporting from point to point, which gets especially obvious at refresh rates above 66Hz. So, the game interpolates so that players smoothly travel along a linear path from point to point. But even people on 60Hz monitors need interpolation, because the Internet isn't perfect. Sometimes packets are sent out of order, delayed, or dropped completely, and under high CPU load the server can even slow its own tick interval from 0.015 seconds to something higher, resulting in maybe 40 packets a second instead of the commonly requested 66. So, on top of being needed for rendering, interpolation also smooths out these irregularities in update intervals.
Let's expand on this imperfection in the Internet further. Interpolation obviously requires two data points to estimate the line between them. So, delays that drop the effective update rate below 66 change when you'll actually have both points available. If your client interp is set to 0.0151515 and the next packet doesn't arrive within that time, your game has no second data point to interpolate towards. So instead, it has to guess that second data point entirely, through a process known as extrapolation. Source uses linear extrapolation: it takes the player's current velocity and just assumes the player keeps moving at that velocity for however long is needed. Since this can have a huge error, it's limited to 0.25 seconds of extrapolation, after which movement just freezes.
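Here's roughly what that guess looks like, as a sketch of the linear extrapolation described above (again my own illustration and naming, not engine source):

    # Guess a position dt seconds past the last real snapshot,
    # assuming the player keeps their last known velocity.
    MAX_EXTRAPOLATE = 0.25  # seconds; past this, movement just freezes

    def extrapolate(last_pos, last_vel, dt):
        dt = min(dt, MAX_EXTRAPOLATE)
        return last_pos + last_vel * dt

If the player dodged, jumped, or stopped right after that last snapshot, this guess is just wrong, and it stays wrong until a real packet arrives.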
Interpolation is obviously better than extrapolation, because of what we discussed previously about delta x's. The update intervals within TF2 are small enough, and player velocities slow enough, that the linear approximation between two real data points doesn't have a large error. Extrapolation, however, isn't going off of any real data about where the player ends up, so its prediction can have a much larger error.
In order to avoid this issue, you need enough packets queued up in your interpolation buffer that the game can use a spare packet instead of guessing very inaccurately. This is why it's generally recommended nowadays to have your interp above 0.0151515, so that the extra packet is there when you need it, at least some of the time. To guarantee a spare packet in the interpolation buffer at all times, you need your interp at 0.0303030, i.e. two full update intervals.
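The buffer math, assuming a steady 66 updates a second:

    interp 0.0151515 = 1 update interval -> no spare packet
    interp 0.0303030 = 2 update intervals -> one guaranteed spare packet

That spare covers any single late or dropped packet before extrapolation has to kick in.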
Now, a lot of people argue that interpolation adds a visual delay on where players are, and that you need the lowest delay possible, ignoring any extrapolation risks. That's true for projectiles, but for hitscan the visual delay is fully lag compensated. This is why they recommend 0.0151515 interp for projectiles and the far safer 0.0303030 value for bullets.
However, I'd argue that the accuracy in current position you get from a lower interp is far less valuable than accuracy in trajectory, because predicting where people are GOING matters much more than seeing a time-accurate picture of exactly where they are right now. On top of that, interpolation always gives you an accurate position history, because it's following what actually happened on the server, albeit delayed, whereas extrapolation can be wrong in both trajectory and position.
So, I recommend cl_interp 0.030303 with cl_interp_ratio 2, if possible. However, I've also tuned a value based on what I've measured of typical packet jitter and drops on servers, plus human jitter tolerances: cl_interp 0.021 with cl_interp_ratio 1. Use whatever works for you. I'd recommend the former for bullets only, i.e. when hitscan precision really matters (Sniper).
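In autoexec form (same style as the command line further down), those two setups are:

cl_interp 0.030303;cl_interp_ratio 2
cl_interp 0.021;cl_interp_ratio 1

(the first for hitscan precision, the second as my tuned general value; pick one, don't run both).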
Now there are other network settings as well. I recommend cl_updaterate 66 and cl_cmdrate 65. Why? This is because of client prediction. For your own player, the client just simulates your actions locally using its own copy of the server code, because that's probably what the result from the server is going to be anyway. However, there is a bug where, if the two rates are equal, client data is preferred over server data. So, I set the server update rate to the highest and the client command rate one lower to avoid this issue. Alternatively, use cl_pred_optimize 1 to prevent the issue instead, keeping both at 66.
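As config lines, that's:

cl_updaterate 66;cl_cmdrate 65

or, with the workaround cvar instead:

cl_updaterate 66;cl_cmdrate 66;cl_pred_optimize 1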
I also recommend different packet size/compression settings.
net_compresspackets 1;net_compresspackets_minsize 128;net_maxroutable 1200;net_maxfragments 1200
The net_maxroutable and net_maxfragments values are based on Valve's updated metrics on good packet sizes in CS:GO and Dota 2. The compression settings revert a change Valve made to TF2 many years ago, which bumped up the compression threshold to reduce load on servers. For clients, however, this threshold should be much lower, since you're compressing much smaller client update packets and generally have less load than a server.