Sunday, April 9, 2017

Windows Server Backup - Delete old backup

I had turned on daily backups on my Windows server, and over time I had accumulated almost 1000 backups from the last 3 years. The Backup UI took ages to do any operation that required enumerating them (even changing the backup schedule), and unfortunately there is no easy way to delete old backups from the UI. The solution is to use the command line. Here are some useful invocations:
  • wbadmin get disks - lists the disks in the system and the space used by the backups
  • wbadmin get versions - lists the backup versions
  • wbadmin delete backup -keepVersions:10 - deletes older backups, keeping only the last 10
  • wbadmin delete backup -version:04/08/2017-10:16 - deletes the specified backup version
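
For example, a hypothetical cleanup session that trims the catalog down to the last 30 daily backups might look like this (the -quiet switch just skips the confirmation prompt):

  wbadmin get versions
  wbadmin delete backup -keepVersions:30 -quiet
  wbadmin get disks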

Sunday, November 27, 2016

Cookie limits - browser side

If you search the net for 'cookie limits', you'll find this site http://browsercookielimits.squawky.net/ (or variations of it). I was baffled that the maximum total cookie size was a 'guess' within a fairly large interval. So I set out to write my own version, hoping to get more accurate numbers.

Here is the result: http://alinconstantin.com/download/browsercookies/cookies.html

It turned out the guessed limits were already accurate (no variation interval was necessary). But it was interesting to learn more about cookies and JavaScript. Here are a few notes.
  • JavaScript support for cookies sucks; it's a weird mechanism. Setting a cookie is done by assigning to a property, document.cookie, but reading that property back returns all the cookies that are set (names and values only, even though the cookies may have been set with other attributes like path or expiration date). See the sketch after this list.
  • If you set cookies from JavaScript, don't forget to keep the names of the cookies you set in a separate list! If you set cookies with values that go over the browser's per-domain limit, browsers like IE/Edge will clear the document.cookie property entirely, and you'd have no way to enumerate the existing cookies (to know what to delete before being able to set new ones). Fwiw, this behavior is browser dependent; Chrome/Firefox will drop the oldest cookies instead...
  • IE support for JavaScript sucks. In IE11, String.prototype.startsWith() and String.prototype.repeat() are not implemented, arrow functions like var f1 = () => { doSomething(); } are not understood, etc. Edge is better in this regard.
  • Chrome is silly in not allowing cookies when scripts are run from file:// locations.
  • IE & Edge have very low limits for the total cookie size. You probably don't want to send 10 KB of cookies with every HTTP request, but I've seen websites hitting these limits... 15 KB would have been more reasonable (and closer to the 16 KB default limit for header sizes on the server side). On the plus side, 5 KB per cookie is better than the ~4 KB of all other browsers (and I've seen websites hitting that limit in other browsers, too).
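
To illustrate both points, here is a minimal sketch of the bookkeeping, assuming nothing beyond document.cookie itself (the helper names and the probing approach are mine, not a standard API):

  // Our own list of cookie names, since document.cookie can't be trusted
  // to enumerate cookies after an overflow (at least in IE/Edge).
  var cookieNames = [];

  function setCookie(name, value) {
    document.cookie = name + "=" + value + "; path=/";
    // Read document.cookie back to verify the browser actually stored it.
    var stored = document.cookie.indexOf(name + "=" + value) !== -1;
    if (stored && cookieNames.indexOf(name) === -1) cookieNames.push(name);
    return stored;
  }

  function deleteCookie(name) {
    // Expiring a cookie in the past is the only way to delete it from script.
    document.cookie = name + "=; path=/; expires=Thu, 01 Jan 1970 00:00:00 GMT";
  }

  // Probe the per-cookie limit: grow the value until the write stops sticking
  // (a linear scan for clarity; a binary search would be faster).
  // new Array(n + 1).join("x") builds a string of n characters, avoiding
  // String.prototype.repeat(), which IE11 lacks.
  function probeMaxCookieSize() {
    var size = 1;
    while (setCookie("probe", new Array(size + 1).join("x"))) size++;
    deleteCookie("probe");
    return (size - 1) + "probe=".length; // the "name=" part counts toward the limit
  }
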
Anyway, here are the measured cookie limits for the major browsers as of this writing:

Browser       Max bytes/cookie   Max cookies   Max total bytes for cookies
IE11 & Edge   5117               50            2*5117 = 10234
Chrome 54     4096               180           180*4096 = 737280
Firefox 50    4097               150           150*4097 = 614550
Opera 41      4096               180           180*4096 = 737280

Thursday, November 24, 2016

Cookie limits - the server side story

If you search the net for 'cookie limits', you'll find this site http://browsercookielimits.squawky.net/ (or variations of it), which lists browser-side cookie limits for a couple of browsers.
RFC 2965 will tell you a browser should support at least 20 cookies of 4096 bytes each per domain, but browsers usually support higher limits. E.g. Chrome supports 180 cookies of 4096 bytes per domain, with no limit on the total size of all cookies. That makes 720 KB of data (180*4096 = 737280 bytes) that Chrome is willing to send with each request.

In reality, even if you insist on sending that crazy big amount of data with every HTTP request, you'll discover it's impossible to use that many cookies. Depending on the server accessed, you may be able to use at most 3 cookies of 4096 bytes! Why? Because there is another side to the story: the servers you are accessing also limit the cookie sizes they will accept.

Those limits vary from HTTP server to server, and so does the server's response when you make larger requests. Here are some examples:
  • www.microsoft.com - throws SocketException / ConnectionForcefullyClosedByRemoteServer after ~16 KB of cookies
  • portal.office.com - starts returning "400 Bad Request – Request Too Long. HTTP Error 400. The size of the request headers is too long" after ~15 KB of cookies
  • www.google.com - starts returning 413 Request Entity Too Large after ~15 KB of cookies
  • www.amazon.com - starts returning 400 Bad Request after ~7.5 KB
  • www.yahoo.com - accepts requests up to ~65 KB; after that it returns 400 Bad Request
  • www.facebook.com - accepts about ~80 KB; after that it starts returning 400 or 502, or throws WebException/MessageLengthLimitExceeded (this seems to depend on the number of cookies, too)
Per https://support.microsoft.com/en-us/kb/820129, IIS defines two configuration settings, MaxFieldLength and MaxRequestBytes, that limit the size of the HTTP request headers it accepts. This covers things like the request URL being accessed, the User-Agent string, AAD authentication tokens, etc., thus limiting the size of cookies carried in the headers, too. For IIS, that limit is 16 KB by default and can be configured. Apache probably has similar limits, and website owners may have adjusted them.
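
On IIS these are registry values under the HTTP service key, per the KB article above. For example, raising both to 32 KB (an illustrative value; the change requires restarting the HTTP service) would look like:

  reg add "HKLM\SYSTEM\CurrentControlSet\Services\HTTP\Parameters" /v MaxFieldLength /t REG_DWORD /d 32768
  reg add "HKLM\SYSTEM\CurrentControlSet\Services\HTTP\Parameters" /v MaxRequestBytes /t REG_DWORD /d 32768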

If you're writing a web application whose cookie usage pushes these limits, it's important to know what your server will tolerate on incoming requests.

I wrote an app you can use to test a server and get an idea of its limits. You can download it from http://alinconstantin.com/Download/ServerCookieLimits.zip and invoke it with the http:// URI of the server to test as a parameter. The app makes requests to the server with cookies of decreasing sizes, trying to narrow down the maximum accepted cookie size. The output looks like the picture below.
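
The idea is simple enough to sketch in a few lines of JavaScript (Node.js). This is not the app's actual code, just a minimal illustration of the approach: a binary search on the Cookie header size, treating errors or 4xx/5xx responses as rejections (real servers may also rate-limit you, which this sketch doesn't handle):

  // Binary-search the largest Cookie header a server will accept.
  const https = require("https");

  function tryCookieSize(host, size) {
    return new Promise((resolve) => {
      const req = https.request(
        { host: host, path: "/", headers: { Cookie: "probe=" + "x".repeat(size) } },
        (res) => {
          res.resume();                    // drain the response body
          resolve(res.statusCode < 400);   // 4xx/5xx means the header was rejected
        }
      );
      req.on("error", () => resolve(false)); // connection reset also means rejected
      req.end();
    });
  }

  async function findMaxCookieSize(host) {
    let lo = 0, hi = 128 * 1024; // assume the limit is somewhere below 128 KB
    while (lo < hi) {
      const mid = Math.ceil((lo + hi) / 2);
      if (await tryCookieSize(host, mid)) lo = mid; else hi = mid - 1;
    }
    return lo;
  }

  findMaxCookieSize("www.example.com").then((n) =>
    console.log("Largest accepted Cookie header: ~" + n + " bytes"));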

Saturday, September 5, 2015

Useless router WiFi speeds and maximum Surface Pro 3 Wifi speed

I got puzzled last week by my Surface Pro 3, which usually connects at 60-90Mbps to my router, regardless of which band (2.4 or 5 GHz) I choose. I knew the Surface Pro 3 had some problems with low WiFi connection speeds when it was released, but since then Microsoft has released driver and firmware updates that were supposed to fix the problem. I upgraded to an AC router, and the Surface Pro also has an AC adapter, so why am I not seeing better speeds?
My WiFi router is a Netgear Nighthawk R8000, which boasts 3.2Gbps WiFi speed. That's just PR; in reality it has one 2.4GHz band with a max of 600Mbps and two 5GHz channels, each supporting a max of 1300Mbps (600 + 1300 + 1300 = 3200, hence the advertised number). So the maximum connection speed is limited to the maximum speed of the band I'm using. But that's not the end of the story: both my Surface Pro 3 and my wife's laptop connect at a maximum of 866.5Mbps, and that's when staying 2-3 feet from the router. The speed is actually negotiated between the router and the client device. If I move 10 feet away, the speed starts dropping to 700Mbps. If I stay in the living room, the speed drops to 80-90Mbps.
The Surface Pro 3 has a 'Marvell AVASTAR Wireless-AC Network Controller' WiFi adapter, and based on http://www.marvell.com/wireless/avastar/88W8897 its maximum WiFi speed is 867Mbps.
I'm reaching this speed (so Microsoft kept its promise and fixed the low-speed problem), but I have to be just feet from the router to reach it. And even in these ideal conditions I'd need at least 4 Surfaces to saturate the two 5GHz channels, plus more WiFi devices connecting on 2.4GHz, to reach the router's advertised speed... The router speeds are just a PR gimmick.


Sunday, November 2, 2014

Digital Photo Professional cannot edit CR2 RAW files

Today I spent almost one hour trying to figure out why Canon DPP was not able to edit some pictures I took a while ago. All the pictures in one folder were shown like this:

[Image: DPP_NoEdit]

Notice the glyph on top of each image indicating editing was not allowed.

I searched the menus three times for some option to unblock editing, but there was none. I thought the files might be read-only on disk, or have wrong ACLs. Nothing. Some website suggested that, for images edit-protected from the camera, there is an unblock option in the Info window; but that window was completely empty instead of displaying EXIF info.

[Image: EmptyInfo]

It was only happening with images in one folder, so I moved an image out of that folder, but nothing changed.

Hours later I viewed the images in Explorer from a different computer, and then I noticed something odd: why were the CR2 files so small compared with the other pictures? Were they corrupted?

[Image: Files]

And then it hit me: when I took those pictures the battery ran out on my 5D Mark III and I had to use my old camera, a Canon 20D. And DPP was not able to open the files from this older camera…

After a little digging on the net, I had the confirmation: Canon has released Digital Photo Professional 4.0, but only for 64-bit computers and only for certain cameras like the Canon 5D Mark III. Older cameras like the Canon 20D are not supported by DPP 4, and instead I had to download the previous version, DPP 3.14, to edit the RAW files. It turns out that even newer Canon cameras like the 7D Mark II are not supported by DPP 4.0, on either 32- or 64-bit Windows. Hopefully Canon will reconsider and add compatibility support for all cameras when they release a new version of DPP 4…

Thursday, September 18, 2014

How to install Active Directory (AD) tools on Windows 8

Start by installing Remote Server Administration Tools for Windows 8.1 to get the 'Remote Server Administration Tools' components, then open 'Turn Windows features on or off' and make sure 'AD DS Tools' is selected.

This article describes the steps in great detail:
http://www.technipages.com/windows-8-install-active-directory-users-and-computers
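
If you prefer the command line, the same component can be enabled with DISM. The feature name below is my assumption for the AD DS tools and varies between Windows versions, so list the available features first and adjust:

  dism /online /get-features | findstr RemoteServer
  dism /online /enable-feature /featurename:RemoteServerAdministrationTools-Roles-AD-DS /all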

Sunday, June 1, 2014

My Wi-Fi connection not using full speed of 300Mbps


I have two Netgear Wi-Fi routers that have Wi-Fi connections enabled with speeds up to 300Mbps. However, the laptop, tablet, etc. connect to them at speeds usually in the 78-144Mbps range, never over 150Mbps. This didn’t bother me much, as these speeds are still above my broadband connection speed (60Mbps), and I don’t transfer many files between the laptop and other computers on the network. But still, why does this happen?

The documentation says “The WNR3500 router will use the channel you selected as the primary channel and expand to the secondary channel (primary channel +4 or -4) to achieve a 40 MHz frame-by-frame bandwidth. The WNR3500 router will detect channel usage and will disable frame-by-frame expansion if the expansion would result in interference with the data transmission of other access points or clients.”

I thought the low speed was caused by router settings. My router had channel 4 set as primary, which left only 4+4=8 as the secondary. I thought some interference on channel 8 was preventing it from being used, so I changed the primary to 5, making both 1 and 9 options for the secondary. But that didn’t increase the connection speed.

Today, after digging more, I looked at the adapter’s settings on the laptop. There, the Channel Width for the 2.4GHz band was set to “20 MHz Only”. I set it to Auto, let the laptop reconnect, and voila! The speed increased, reaching values in the 270-300Mbps range, as it should have.

[Images: the adapter’s Channel Width setting]

It didn’t make sense; why wouldn’t this be set to Auto by default? Then I remembered. It was.

Three years ago I was experiencing frequent connection drops and a lot of reconnecting; I was not able to maintain a Remote Desktop connection to work without the laptop pausing to reconnect every couple of minutes. It was really annoying. And it was me who had limited the channel width to 20 MHz, which seemed to reduce the number of connection drops.

Well, now I have a second Wi-Fi router to extend the range, and the laptop’s connection at 300Mbps seems more reliable. So I guess I’ll leave the laptop’s channel width at its default setting.

Unfortunately the Surface RT’s network adapter doesn’t have a similar setting, so the tablet will have to connect at 150Mbps max. No loss there until Comcast allows such speeds at reasonable prices.
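
By the way, a quick way to check what link speed any of these machines actually negotiated, without opening the adapter’s properties, is (the exact field names vary a bit between Windows versions):

  netsh wlan show interfaces

Look for the “Receive rate (Mbps)” and “Transmit rate (Mbps)” lines; with a 40 MHz channel width the rate should be able to climb above the ~144Mbps cap of a 20 MHz, two-stream connection.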