Polymer.js: The Future of Web Application Development?

About a year ago, in May 2013, Google launched Polymer.js.
So here we are, a year later, and the question is: Is it ready for prime time yet? Is it yet possible to create a production-ready application using Polymer web development techniques?
To answer this question, I took Polymer out for a test drive to develop a web application and to see how well it would behave. This article is about that experience and what I learned in the process.

Polymer.js: The Concept

Before we get into our Polymer tutorial, let’s first define Polymer.js, not for what it claims to be, but for what it actually is.
When you begin to check out Polymer, you can’t help but immediately be intrigued by its self-professed unique world view. Polymer purports itself as taking a sort of back-to-nature approach that “puts elements back at the center of web development”. With Polymer.js, you can craft your own HTML elements and compose them into complete, complex web applications that are scalable and maintainable. It’s all about creating new (i.e., custom) elements that can then be reused in your HTML pages in a declarative way, without needing to know or understand their internals.
Elements, after all, are the building blocks of the web. Accordingly, Polymer’s weltanschauung is that web development should fundamentally be based on extending the existing element paradigm to build more powerful web components, rather than replacing markup with “gobs of script” (to use their words). Stated another way, Polymer believes in leveraging the browser’s “native” technologies rather than relying on an increasingly complex labyrinth of custom JavaScript libraries (jQuery et. al.). An intriguing notion indeed.
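To make that concrete, here is a minimal sketch of what "elements at the center" looked like in the 0.x-era Polymer syntax current at the time of writing. The hello-badge element and its attribute are hypothetical, invented for illustration:

```html
<!-- hello-badge.html: define a custom element declaratively. -->
<polymer-element name="hello-badge" attributes="name">
  <template>
    <span>Hello, {{name}}!</span>
  </template>
  <script>
    Polymer('hello-badge');
  </script>
</polymer-element>
```

A page then imports the definition once and uses the tag like any built-in element, with no knowledge of its internals:

```html
<link rel="import" href="hello-badge.html">
<hello-badge name="Ada"></hello-badge>
```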
OK, so that’s the theory. Now let’s take a look at the reality.

Polymer Web Development: The Reality

While Polymer’s philosophical approach certainly has merit, it is unfortunately an idea that (at least to some extent) is ahead of its time.
Polymer.js places a hefty set of requirements on the browser, relying on a number of technologies that are still in the process of standardization (by W3C) and are not yet present in today’s browsers. Examples include the shadow DOM, template elements, custom elements, HTML imports, mutation observers, model-driven views, pointer events, and web animations. These are marvelous technologies, but as of now they are yet to come to modern browsers.
The Polymer strategy is to have front-end developers leverage these leading-edge, still-to-come browser technologies as they become available. In the meantime, to fill the gap, Polymer suggests the use of polyfills (downloadable JavaScript code that provides features not yet built into today’s browsers). The recommended polyfills are designed in such a way that (at least theoretically) they will be seamless to replace once the native browser versions of these capabilities become available.
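The polyfill idea itself is simple: install a fallback only when the native feature is missing, so native code automatically wins once browsers catch up. Here is a minimal sketch of that pattern, using a plain object (`env`) as a stand-in for the browser’s global object and a deliberately toy `registerElement` fallback; this is the general technique, not Polymer’s actual polyfill code:

```javascript
// Sketch of the polyfill pattern: only patch the environment when the
// native capability is absent. `env` stands in for e.g. `window`.
function ensureRegisterElement(env) {
  if (typeof env.registerElement !== 'function') {
    // Toy fallback; a real polyfill would emulate the spec'd behavior.
    env.registerElement = function (name) {
      return { name: name, polyfilled: true };
    };
  }
  return env.registerElement;
}

// An environment lacking the feature gets the fallback...
const legacy = {};
ensureRegisterElement(legacy);
console.log(legacy.registerElement('x-app').polyfilled); // true

// ...while an environment with a "native" implementation is untouched.
const modern = { registerElement: (name) => ({ name: name, polyfilled: false }) };
ensureRegisterElement(modern);
console.log(modern.registerElement('x-app').polyfilled); // false
```

Because the patched API has the same shape as the native one, calling code never needs to know which implementation it is using, which is what makes the eventual switch-over "seamless".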
OK, fine. But let me get this straight: at least for now, we’re going to use JavaScript libraries (i.e., polyfills) to avoid the use of JavaScript libraries? Well, that’s “fascinating”.
The bottom line, then, is that we’re in a sort of limbo mode with Polymer, as it is ultimately relying on (or perhaps more accurately, approximating) browser technologies that don’t yet exist. Accordingly, Polymer.js today seems more like a study in how element-centric applications may be built in the future (i.e., when all the necessary features are implemented in the major browsers and polyfills are no longer needed). But, at least at present, Polymer seems more like an intriguing concept than an actual option for creating robust change-your-view-of-the-world applications right here and now, which makes writing (or finding) a Polymer tutorial difficult outside of Google’s documentation.

Polymer Architecture

Now, onto our guide. Polymer.js is architecturally divided into four layers:
  • Native: Needed features currently available natively in all major browsers.
  • Foundation: Polyfills that implement needed browser features not yet natively available in the browsers themselves. (The intention is for this layer to disappear over time as the capabilities it provides become available natively in the browser.)
  • Core: The necessary infrastructure for Polymer elements to exploit the capabilities provided by the Native and Foundation layers.
  • Elements: A basic set of elements, intended to serve as building blocks that can help you create your application. Includes elements that provide:
    • Basic functionality like ajax, animation, flex layout, and gestures.
    • Encapsulation of complicated browser APIs and CSS layouts.
    • UI component renderers such as accordions, cards, and sidebars.
This image guide shows the 4 architectural layers of Polymer.js web development.
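As an example of the Elements layer, Polymer’s 0.x element set included a declarative ajax element. As I recall the core-ajax element of that era, a request looked roughly like this; attribute names are from memory and may differ across versions, so treat it as a sketch:

```html
<!-- Declarative ajax: the request fires automatically ("auto")
     and the parsed JSON lands in the bound `items` property. -->
<core-ajax
    auto
    url="https://example.com/api/items"
    handleAs="json"
    response="{{items}}">
</core-ajax>
```

The point is the philosophy: behavior that would normally be a script call becomes a tag in your markup.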

Creating a Polymer Application

To get started, there are some articles and tutorial write-ups that help introduce you to Polymer, its concepts, and its structure. But if you’re anything like me, when you’ve gone through them and are ready to actually build your application, you quickly realize that you’re really not quite sure where to start or how to create it. Since I’ve now gone through the process and figured it out, here are some pointers…
Polymer web development is all about creating elements, and is only about creating elements. So, consistent with the Polymer world view, our application is going to be… a new element. Nothing more and nothing less. Oh OK, I get it. So that’s where we start.
For our Polymer project example, I’ll name the top-level element of the application <toptal-app>, since custom element names (regardless of what framework you use to create them) must include a hyphen (e.g., x-tags, polymer-elements, etc.).
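Using the declarative 0.x syntax of the time, the skeleton of that top-level element might look something like this. This is an illustrative sketch reusing the toptal-app name from this example, not the article’s actual code:

```html
<!-- toptal-app.html -->
<polymer-element name="toptal-app">
  <template>
    <h1>My Polymer App</h1>
    <!-- the rest of the app's markup goes here -->
  </template>
  <script>
    Polymer('toptal-app');
  </script>
</polymer-element>
```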
The next step, though, requires quite a bit more thought. We need to decide on how we are going to componentize our application. An easy approach is just to try to identify, from a visual perspective, the components in our application and then try to create them as custom elements in Polymer.
So for example, imagine that we have an app with the following screens:
This tutorial image depicts three Polymer.js web development screens in action.
We can identify that the top bar and the side bar menu are not going to change, while the actual “content” area of the app could load different “views”.
That being the case, one reasonable approach would be to create the <toptal-app> element for our app and, inside that element, use some Polymer UI Elements to create the top bar and the side bar menu.
We can then create our two main views, which we’ll call ListView and SingleView, to be loaded into the “content” area. For the items in the ListView, we can create an ItemView.
This will then yield a structure something like this:
This is a demo of an example Polymer.js structure.
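In markup, that decomposition might be sketched as follows. The nested tag names (my-topbar, my-sidebar, list-view, single-view, item-view) are illustrative stand-ins; the article does not give the real tag names:

```html
<!-- Top bar and side bar stay fixed; the content area swaps views. -->
<polymer-element name="toptal-app">
  <template>
    <my-topbar></my-topbar>
    <my-sidebar></my-sidebar>
    <div id="content">
      <list-view></list-view>     <!-- renders one item-view per item -->
      <single-view></single-view>
    </div>
  </template>
  <script>
    Polymer('toptal-app');
  </script>
</polymer-element>
```

Each view is itself just another custom element, so the whole application remains a tree of elements all the way down.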

The Good News

Now that we have our example Polymer application, we can insert it into any web page just by importing our “toptal-app.html” and adding the <toptal-app> tag, because, after all, our app is just an element. That’s cool.
In fact, therein does lie much of the power and beauty of the Polymer paradigm. The custom elements you create for your application (including the top-level one for your entire application) are treated as any other element in a web page. You can therefore access their properties and methods from any other JavaScript code or library (e.g., Backbone.js, Angular.js, etc.). You can even use those libraries to create your own new elements.
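For example, since the app is just a DOM node, plain JavaScript (or any library) can reach into it. This sketch assumes a hypothetical theme attribute and refresh() method on the element, purely for illustration:

```html
<toptal-app id="app"></toptal-app>
<script>
  // The custom element is just a node in the DOM...
  var app = document.querySelector('toptal-app');
  // ...so its attributes, properties, and methods are reachable
  // exactly like those of any built-in element.
  app.setAttribute('theme', 'dark');   // hypothetical attribute
  if (typeof app.refresh === 'function') {
    app.refresh();                     // hypothetical method
  }
</script>
```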
Moreover, your custom components are compatible with other custom element libraries (such as Mozilla’s X-Tag). So it doesn’t matter what you use to create your own custom element; it is compatible with Polymer and any other browser technology.
It’s therefore not surprising that we’ve already started to see the advent of a community of Element Creators that expose and share their newly created elements in forums like the Custom Elements site. You can go there and grab whatever component you need and just use it in your application.

On the other hand…

Polymer is still sufficiently new technology that developers, especially novice app developers, are likely to find it to be somewhat brittle, with a number of not-so-hard-to-find rough edges.
Here’s a sampling:
  • Lack of documentation and guidance.
    • Not all Polymer.js UI and non-UI Elements are documented. Sometimes the only “guidance” on how to use them is demo code. In some cases, it’s even necessary to refer to the source code of a Polymer Element to better understand how it works and how it can/should be used.
    • It’s not entirely clear how to organize larger applications. In particular, how are you supposed to pass singleton objects between elements? Which strategy should you employ to test your custom elements? Guidance on these types of issues at this point is scant at best.
  • Dependency errors and version-itis. Even when you download Polymer.js elements as recommended, you might find yourself with a dependency error pointing to different version dependencies in the same element. While it is understood that Polymer Elements are currently under heavy development, these kinds of problems can make development quite challenging, eroding developer confidence and interest.
  • Problems on mobile platforms. Polymer.js performance on mobile platforms can often be somewhere between frustrating and problematic.
    • Downloading the entire library and polyfills (without gzip’ing) is slow, and you need to download every Polymer Element that you intend to use.
    • Processing the polyfills, libraries, and custom elements appears to be an expensive task on mobile platforms. Even when the downloading is complete, you still often have a blank screen for a few seconds.
    • Especially for more complex functionality (such as drag-and-drop or rendering into a canvas), you may find that functionality that works fine on the desktop simply does not work properly, or is not yet supported, on the mobile platform. In my particular case, one such frustration that I encountered was with rendering into a canvas on iOS.
  • Inadequate or confusing error reporting. Sometimes when you misspell an attribute name, or just break something related to the core layer itself, you receive a strange error message on your console with two call stacks that you need to investigate to try to determine where the problem is. Sometimes it’s easy to solve, but sometimes you end up needing to try a completely different strategy just to avoid the error, since you can’t track down its source.


Polymer is intriguing technology, but it is undeniably still in its infancy. As such, it’s not yet well suited to development of a large, enterprise level, production-ready application. Additionally, there aren’t many guides or tutorials available specific to Polymer.js web development.
As far as whether the JavaScript-centric or DOM-centric approach is really fundamentally better, the jury is still out. Polymer makes some convincing arguments, but counter-arguments do exist.
Perhaps most notably, Polymer requires a fairly significant level of expertise in using browser technologies like the DOM. You’re in many ways returning to the days before jQuery, learning the DOM API to do simple tasks like adding or removing a CSS class from an element. This certainly does make it feel, at least on some level, like you’re taking a step backward rather than forward.
But that said, it does appear likely that custom elements are going to be an important part of Web Development in the future, so diving in sooner rather than later would probably be prudent for today’s web developer. And if you’ve never tried to create your own custom elements before, Polymer (and this tutorial) is probably a sensible place to start.
This article originally appeared on Toptal


Some of the coolest visualizations in the programming dev/test world

Here are a few interesting visualizations that I found while doing research for my talk about testing Insights, including some that I did not end up using:

  • Open source contributions by location 
  • GitHut: an attempt to visualize and explore the complexity of the universe of programming languages used across the repositories hosted on GitHub.
  • Who speaks what on GitHub?

  • Visualization 1 is a chord diagram, which indicates the relationship between all possible combinations of programming languages. This data was computed by creating all possible pairs that could be created using the list of 20 languages I have analyzed. By analyzing the combinations, and the number of users that speak both of the languages in question, we get a good idea of what languages are spoken most, but also which languages are 'spoken' quite a lot, but not in combination. It gives a different perspective of the user-language landscape on GitHub.

    Visualization 2 makes direct use of the structure of the MySQL database I described in the section above. It allows you to search for a particular username and find out which languages this user speaks. While not very revolutionary, it is a very natural and logical way to query the data I obtained.

    Visualization 3 is the exact inverse of the second visualization. It offers you the capability of finding users that speak a given combination of languages. This may be useful if you're looking for a specific skillset for a project, and are looking for someone to help you out.

    • Community: Over the past year, GitHub partnered with, held, and sponsored events all over the world. At Patchworks we watched new developers learn how to use Git. Our ConnectHome partnership provided low-cost internet access for families living in HUD-assisted housing. Sponsoring events like Rails Girls and hosting our own conferences allowed us to meet more GitHub users than ever before.

    • Newcomers: This year GitHub grew by more than 5.2 million users and 303K organizations. We have more new students, developers, and businesses using GitHub than ever before.

    • Organizations: With almost 80M total Pull Requests on GitHub, we know that 85% of all requests for change come from within organizations.

  • Tabs or spaces. We are going to parse every file among all programming languages known by GitHub to decide which one is on top.

  • Programming language associations - Mapping organizations with projects on GitHub to their respective programming languages
  • Analyzing emotions in texts based on the occurrence of expressions 

  • Amount of profanity in git commit messages per programming language

StackOverflow questions tagged vs. ranking the popularity of programming languages

Projects using the fork to pull paradigm

Happy visualizing!


BDD Guidelines - writing features in the Gherkin language

Some of us here are working on the BDD guidelines that should be followed.
I'd be interested to hear if anyone has thoughts to share:


  1. Explain in the feature file what the feature is about, just after the “Feature:” before proceeding to the scenarios (preferably in “In order/As a/I want” format).
  2. Write high-level scenario steps by raising the level of abstraction, focusing on the “what” rather than the “how”:
    • don’t mention UI elements
    • don’t mention ‘click’ or other actions linked to specific UI elements
    • the scenario should remain valid if the UI is replaced with a new UI tomorrow
    • avoid very detailed steps where possible (helps to focus and avoid clutter)
  3. Write scenarios using business language rather than using technical language so that it can be understood by everyone.
  4. Write scenarios from the perspective of the person who will use or needs the feature (not always a member or user). Use 'I' and explain who the 'I' is in 'Given' step. 
  5. Each and every scenario needs to be independent, i.e. scenarios should not depend on other scenarios or data created by other scenarios.
  6. Don't mention features or actions which are not related to the actual feature under test.
    • Example: The scenario is to check that the balance is displayed correctly in a user account. Before checking the balance, the user needs to log in, but login is not part of the check-balance scenario.
  7. Use Scenario Outline when you have several scenarios that follow exactly the same pattern of steps, just with different input values or expected outcomes.
  8. Scenario Outline examples should be easy to understand. Column names should be meaningful (e.g. | Contact Method | Enquiry Type | instead of | value1 | value2 | ), so that the steps are understandable.
  9. Scenarios need to be environment independent.
  10. Write scenarios for testing happy paths and important error cases.
  11. Avoid typos and always use grammatically correct English.
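Pulling points 1-4 together, a feature file following these guidelines might start like this. This is an illustrative sketch, not taken from a real project:

```gherkin
Feature: Check account balance
  In order to keep track of my spending
  As an account holder
  I want to see my current balance

  Scenario: Balance is shown for an account with funds
    Given I am an account holder with $100 in my account
    When I check my balance
    Then I should see a balance of $100
```

Note that the steps say nothing about pages, buttons, or clicks, so the scenario would survive a complete UI redesign.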


  1. Wherever possible steps should be reused. This can also be achieved by
    • parameterizing (if applicable)
    • keeping it short and simple
  2. Whenever a feature or behaviour changes, the existing scenarios need to be updated (from the latest branch in TFS)
  3. Actions which are not directly related to the scenario should be handled outside the actual scenario:
    • Actions like login should be handled as part of a Background or hooks.
    • Setup and teardown should be part of hooks and should not be part of the scenario.
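For instance, a login step that every scenario needs can move into a Background (or a Before hook in the step-definition code), keeping the scenarios themselves focused. A hypothetical sketch:

```gherkin
Feature: Check account balance

  Background:
    Given I am logged in as an account holder

  Scenario: Balance is displayed
    When I check my balance
    Then I should see my current balance
```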

Bad Example 1

Scenario Outline: Detect agent type based on contract number (single contract found)
Given I am on the "Find me" page
And I have entered a contract number
When I click the "Continue" button
And a contract number match is found
And the agent type is <DistributorType>
Then the contract number field will become uneditable
And the "Back" button will be displayed
And the following <input field type> and <text> will be displayed

Examples:
| DistributorType | input field type | text                            |
| Broker          | Date of birth    | Please enter your last name     |
| TiedAgent       | Last name        | Please enter your date of birth |

Good Example 1

Same test as in Bad Example 1, but with a focus on a single business rule and without UI details:
Scenario: Customer has a broker policy so DOB is requested
Given I have a "Broker" policy
When I submit my policy number
Then I should be asked for my date of birth

Scenario: Customer has a tied agent policy so last name is requested
Given I have a "TiedAgent" policy
When I submit my policy number
Then I should be asked for my last name

Scenario Outline Example

Scenario Outline: Withdraw fixed amount
Given I have <Balance> in my account
When I choose to withdraw the fixed amount of <Withdrawal>
Then I should receive <Received> cash
And the balance of my account should be <Remaining>

Examples:
| Balance | Withdrawal | Received | Remaining |
| $500    | $50        | $50      | $450      |
| $500    | $100       | $100     | $400      |
| $500    | $200       | $200     | $300      |


Help a fellow tester

I don't know Sathish in person, but from what I read here, he is trying to raise 60,000 Rs (about $1,200).




New Mind-mapping software - MindMup 2.0 For Google Drive

I don't have to introduce to you what a mind-map is.  I am a big fan of mindmaps and this is a cool addition to the utilities.

The biggest change:  Your mind maps are stored in Google's cloud infrastructure, so you can use them from any device and any location.

And it's FREE: store unlimited mind maps at no cost on Google Drive!



Gist - What is a pastebin?

Gist a little bit!

What is gist?

Did you know that GitHub also operates other services, including a pastebin-style site called Gist?
You can share single files, parts of files, or full applications. You can access gists at https://gist.github.com,
or discover many gists at https://gist.github.com/discover

More on gists here: https://help.github.com/categories/gists/

So that brings us to our next question:

What is a pastebin?

In simple words, a pastebin is a web service where users can store and share plain text snippets.
Some trivia: pastebins were developed in the late 1990s to facilitate sharing snippets of text over Internet Relay Chat (IRC).

Source: https://en.wikipedia.org/wiki/Pastebin


Group multiple GitHub repositories by keyword or tag

Problem: If you are using GitHub and have multiple repositories, you surely want to organize them.
How do I group related GitHub repositories in a folder structure?
Is there a feature providing any ability to order, structure, or even tag repositories on GitHub?

The answer is no, unless you want to use GitHub Organizations.

But there is a Solution:

You can use Gitrep (an external OAuth app), which allows you to organize starred repos using the concept of tagging.

Gitrep also helps you to do this: 

"How to best find and compare different open source options. Typically it is not easy to answer questions like "What is the most popular jQuery Instagram library?" or "What Ruby gems are similar to Devise but newer?".

This is where Gitrep aims to help.

Gitrep allows you to search repositories by community created tags and their descriptions, along with apply personal tags that you can use for your own personal organization. You can also see repositories that are similarly tagged, along with see "users who star this repository typically also star these repositories" type relations."

Source: https://www.gitrep.com/

Happy World Tester's Day!

It's that day of the year again!

Surely every day is Tester's Day, but it feels good to have a special day that can make us testers around the world share and collaborate and feel proud to be one.
On September 9, 1945, scientists at Harvard University, while testing the Mark II Aiken Relay Calculator, found a moth stuck between the contacts of an electromechanical relay.

The work they performed required some description, and a word was found for it: “debugging” (literally, the removal of an insect). It is now used to describe the process of identifying and eliminating bugs which cause a computer to malfunction. The removed insect was pasted into the computer log with the entry “First actual case of bug being found”, and was then transferred to the computer museum.

Now that you know, Happy Tester's Day!


Learn PowerShell

Here is an excellent read and an amazing YouTube playlist:

Effective Windows PowerShell: The Free eBook


Learn Windows PowerShell in a Month of Lunches


Happy Scripting!


Really cool Programming Competency Matrix

Here is an amazing snapshot of a programming competency matrix at different levels:
Computer Science
Levels: 0 (2^n), 1 (n^2), 2 (n), 3 (log(n))

data structures
  • Level 0: Doesn't know the difference between an Array and a LinkedList.
  • Level 1: Able to explain and use Arrays, LinkedLists, Dictionaries etc. in practical programming tasks.
  • Level 2: Knows space and time tradeoffs of the basic data structures, Arrays vs. LinkedLists; able to explain how hashtables can be implemented and can handle collisions; priority queues and ways to implement them etc.
  • Level 3: Knowledge of advanced data structures like B-trees, binomial and Fibonacci heaps, AVL/Red-Black trees, Splay Trees, Skip Lists, tries etc.

algorithms
  • Level 0: Unable to find the average of numbers in an array (it's hard to believe, but I've interviewed such candidates).
  • Level 1: Basic sorting, searching, and data structure traversal and retrieval algorithms.
  • Level 2: Tree, graph, simple greedy and divide-and-conquer algorithms; is able to understand the relevance of the levels of this matrix.
  • Level 3: Able to recognize and code dynamic programming solutions, good knowledge of graph algorithms, good knowledge of numerical computation algorithms, able to identify NP problems etc.

systems programming
  • Level 0: Doesn't know what a compiler, linker or interpreter is.
  • Level 1: Basic understanding of compilers, linkers and interpreters. Understands what assembly code is and how things work at the hardware level. Some knowledge of virtual memory and paging.
  • Level 2: Understands kernel mode vs. user mode, multi-threading, synchronization primitives and how they're implemented; able to read assembly code. Understands how networks work; understanding of network protocols and socket-level programming.
  • Level 3: Understands the entire programming stack, hardware (CPU + memory + cache + interrupts + microcode), binary code, assembly, static and dynamic linking, compilation, interpretation, JIT compilation, garbage collection, heap, stack, memory addressing...
Software Engineering
Levels: 0 (2^n), 1 (n^2), 2 (n), 3 (log(n))

source code version control
  • Level 0: Folder backups by date.
  • Level 1: VSS and beginning CVS/SVN user.
  • Level 2: Proficient in using CVS and SVN features. Knows how to branch and merge, use patches, set up repository properties etc.
  • Level 3: Knowledge of distributed VCS systems. Has tried out Bzr/Mercurial/Darcs/Git.

build automation
  • Level 0: Only knows how to build from the IDE.
  • Level 1: Knows how to build the system from the command line.
  • Level 2: Can set up a script to build the basic system.
  • Level 3: Can set up a script to build the system and also documentation, installers, generate release notes and tag the code in source control.

automated testing
  • Level 0: Thinks that all testing is the job of the tester.
  • Level 1: Has written automated unit tests and comes up with good unit test cases for the code that is being written.
  • Level 2: Has written code in TDD manner.
  • Level 3: Understands and is able to set up automated functional, load/performance and UI tests.
problem decomposition
  • Level 0: Only straight-line code with copy-paste for reuse.
  • Level 1: Able to break up a problem into multiple functions.
  • Level 2: Able to come up with reusable functions/objects that solve the overall problem.
  • Level 3: Use of appropriate data structures and algorithms; comes up with generic/object-oriented code that encapsulates aspects of the problem that are subject to change.

systems decomposition
  • Level 0: Not able to think above the level of a single file/class.
  • Level 1: Able to break up problem space and design a solution as long as it is within the same platform/technology.
  • Level 2: Able to design systems that span multiple technologies/platforms.
  • Level 3: Able to visualize and design complex systems with multiple product lines and integrations with external systems. Also should be able to design operations support systems like monitoring, reporting, failovers etc.

communication
  • Level 0: Cannot express thoughts/ideas to peers. Poor spelling and grammar.
  • Level 1: Peers can understand what is being said. Good spelling and grammar.
  • Level 2: Is able to effectively communicate with peers.
  • Level 3: Able to understand and communicate thoughts/design/ideas/specs in an unambiguous manner and adjusts communication as per the context.

code organization within a file
  • Level 0: No evidence of organization within a file.
  • Level 1: Methods are grouped logically or by accessibility.
  • Level 2: Code is grouped into regions and well commented with references to other source files.
  • Level 3: File has license header, summary, well commented, consistent white space usage. The file should look beautiful.
code organization across files
  • Level 0: No thought given to organizing code across files.
  • Level 1: Related files are grouped into a folder.
  • Level 2: Each physical file has a unique purpose, e.g. one class definition, one feature implementation etc.
  • Level 3: Code organization at a physical level closely matches design, and looking at file names and folder distribution provides insights into design.

source tree organization
  • Level 0: Everything in one folder.
  • Level 1: Basic separation of code into logical folders.
  • Level 2: No circular dependencies; binaries, libs, docs, builds, third-party code all organized into appropriate folders.
  • Level 3: Physical layout of source tree matches logical hierarchy and organization. The directory names and organization provide insights into the design of the system.

code readability
  • Level 0: Mono-syllable names.
  • Level 1: Good names for files, variables, classes, methods etc.
  • Level 2: No long functions; comments explaining unusual code, bug fixes, code assumptions.
  • Level 3: Code assumptions are verified using asserts; code flows naturally, with no deep nesting of conditionals or methods.

defensive coding
  • Level 0: Doesn't understand the concept.
  • Level 1: Checks all arguments and asserts critical assumptions in code.
  • Level 2: Makes sure to check return values and check for exceptions around code that can fail.
  • Level 3: Has their own library to help with defensive coding; writes unit tests that simulate faults.
error handling
  • Level 0: Only codes the happy case.
  • Level 1: Basic error handling around code that can throw exceptions/generate errors.
  • Level 2: Ensures that errors/exceptions leave the program in a good state; resources, connections and memory are all cleaned up properly.
  • Level 3: Codes to detect possible exceptions beforehand, maintains a consistent exception handling strategy in all layers of code, comes up with guidelines on exception handling for the entire system.

IDE
  • Level 0: Mostly uses IDE for text editing.
  • Level 1: Knows their way around the interface; able to effectively use the IDE using menus.
  • Level 2: Knows keyboard shortcuts for most used operations.
  • Level 3: Has written custom macros.

API
  • Level 0: Needs to look up the documentation frequently.
  • Level 1: Has the most frequently used APIs in memory.
  • Level 2: Vast and in-depth knowledge of the API.
  • Level 3: Has written libraries that sit on top of the API to simplify frequently used tasks and to fill in gaps in the API.

frameworks
  • Level 0: Has not used any framework outside of the core platform.
  • Level 1: Has heard about but not used the popular frameworks available for the platform.
  • Level 2: Has used more than one framework in a professional capacity and is well-versed with the idioms of the frameworks.
  • Level 3: Author of a framework.
requirements
  • Level 0: Takes the given requirements and codes to spec.
  • Level 1: Comes up with questions regarding missed cases in the spec.
  • Level 2: Understands the complete picture and comes up with entire areas that need to be spec'ed.
  • Level 3: Able to suggest better alternatives and flows to given requirements based on experience.

scripting
  • Level 0: No knowledge of scripting tools.
  • Level 1: Batch files/shell scripts.
  • Level 2: Perl/Python/Ruby/VBScript/PowerShell.
  • Level 3: Has written and published reusable code.

database
  • Level 0: Thinks that Excel is a database.
  • Level 1: Knows basic database concepts: normalization, ACID, transactions; can write simple selects.
  • Level 2: Able to design good and normalized database schemas keeping in mind the queries that'll have to be run; proficient in use of views, stored procedures, triggers and user-defined types. Knows the difference between clustered and non-clustered indexes. Proficient in use of ORM tools.
  • Level 3: Can do basic database administration, performance optimization, index optimization, write advanced select queries; able to replace cursor usage with relational SQL; understands how data is stored internally, how indexes are stored internally, how databases can be mirrored, replicated etc. Understands how two-phase commit works.
languages with professional experience
  • Level 0: Imperative or object-oriented.
  • Level 1: Imperative, object-oriented and declarative (SQL); added bonus if they understand static vs. dynamic typing, weak vs. strong typing and static inferred types.
  • Level 2: Functional; added bonus if they understand lazy evaluation, currying, continuations.
  • Level 3: Concurrent (Erlang, Oz) and logic (Prolog).

platforms with professional experience
  • Level 0: 1
  • Level 1: 2-3
  • Level 2: 4-5
  • Level 3: 6+

years of professional experience
  • Level 0: 1
  • Level 1: 2-5
  • Level 2: 6-9
  • Level 3: 10+

domain knowledge
  • Level 0: No knowledge of the domain.
  • Level 1: Has worked on at least one product in the domain.
  • Level 2: Has worked on multiple products in the same domain.
  • Level 3: Domain expert. Has designed and implemented several products/solutions in the domain. Well versed with standard terms and protocols used in the domain.
tool knowledge
  • Level 0: Limited to primary IDE (VS.Net, Eclipse etc.).
  • Level 1: Knows about some alternatives to popular and standard tools.
  • Level 2: Good knowledge of editors, debuggers, IDEs, open source alternatives etc., e.g. someone who knows most of the tools from Scott Hanselman's power tools list. Has used ORM tools.
  • Level 3: Has actually written tools and scripts; added bonus if they've been published.

languages exposed to
  • Level 0: Imperative or object-oriented.
  • Level 1: Imperative, object-oriented and declarative (SQL); added bonus if they understand static vs. dynamic typing, weak vs. strong typing and static inferred types.
  • Level 2: Functional; added bonus if they understand lazy evaluation, currying, continuations.
  • Level 3: Concurrent (Erlang, Oz) and logic (Prolog).

codebase knowledge
  • Level 0: Has never looked at the codebase.
  • Level 1: Basic knowledge of the code layout and how to build the system.
  • Level 2: Good working knowledge of the code base; has implemented several bug fixes and maybe some small features.
  • Level 3: Has implemented multiple big features in the codebase and can easily visualize the changes required for most features or bug fixes.

knowledge of upcoming technologies
  • Level 0: Has not heard of the upcoming technologies.
  • Level 1: Has heard of upcoming technologies in the field.
  • Level 2: Has downloaded the alpha preview/CTP/beta and read some articles/manuals.
  • Level 3: Has played with the previews and has actually built something with them, and as a bonus shared that with everyone else.
platform internals
  • Level 0: Zero knowledge of platform internals.
  • Level 1: Has basic knowledge of how the platform works internally.
  • Level 2: Deep knowledge of platform internals; can visualize how the platform takes the program and converts it into executable code.
  • Level 3: Has written tools to enhance or provide information on platform internals, e.g. disassemblers, decompilers, debuggers etc.

books
  • Level 0: Unleashed series, 21 Days series, 24 Hours series, Dummies series...
  • Level 1: Code Complete, Don't Make Me Think, Mastering Regular Expressions.
  • Level 2: Design Patterns, Peopleware, Programming Pearls, Algorithm Design Manual, Pragmatic Programmer, Mythical Man Month.
  • Level 3: Structure and Interpretation of Computer Programs; Concepts, Techniques, and Models of Computer Programming; Art of Computer Programming; Database Systems by C. J. Date; Thinking Forth; Little Schemer.

blogs
  • Level 0: Has heard of them but never got the time.
  • Level 1: Reads tech/programming/software engineering blogs and listens to podcasts regularly.
  • Level 2: Maintains a link blog with some collection of useful articles and tools that he/she has collected.
  • Level 3: Maintains a blog in which personal insights and thoughts on programming are shared.