Yes, the War of 1812 helped pave the way for a special relationship between the United States and the United Kingdom. The war established the United States as a major power that should be treated as an ally rather than an enemy. Had the war not taken place, the United Kingdom might never have developed the respect for the United States that it held from that point on. That respect helped the special relationship develop and grow.
The War of 1812 was the United Kingdom's attempt to reassert a presence on the North American continent. The Revolutionary War had taken the UK by surprise, and the British felt they had not been militarily prepared. Once the US proved itself a viable force, the UK believed it could return and reclaim the land from the usurpers. The War of 1812 put an end to that fantasy. The UK was defeated on its own terms and would not make another attempt, knowing the Americans were a strong and well-organized force. After all, most of them were British trained.
The War of 1812 showed that the British were still able to fight the Americans, but the Americans held the upper hand and sent them packing. After this, both countries decided it would be better to work together than to keep fighting, which proved beneficial in the long run.
I believe that the War of 1812 and the principles behind it did help define the relationship between the US and the UK. The war proved that the two nations were more evenly matched than had previously been thought, and it helped them move past lingering issues from the Revolutionary War. Signing a peace treaty made the nations feel a need to work together, and that promise of unity from so long ago still holds today.
No, the War of 1812 did not pave the way for the special relationship between the United States and the United Kingdom, because wars rarely bring people closer together. The two countries were at odds during the War of 1812; it was not until years later that they recognized their commonalities.