
Singleton Pattern is an Anti-pattern

Mhykiel
3/21/2015 10:56:12 PM
No love in the technology forum. I'll try here for one last attempt.

The Singleton pattern in object-oriented programming, I consider an anti-pattern. Systems today, and for the near future, will be using more parallel computing and running across networks on different platforms.

The open-closed principle (open for extension, closed for modification) has led to the advice that global variables be avoided in all cases, because such a variable is not protected from modification by other objects.

The Singleton pattern was created to solve the problem of needing a single instantiation of an object that controls concurrent access to a shared resource from multiple parts of a system.

However, this pattern causes problems. When the class is referenced, it checks for an existing instance of the singleton object; if none is found, it creates one.

This essentially makes a global variable that is protected from modification. But in the fast-paced, multi-node environments we work in today, it's possible for two or more objects to be created, with no way of reconciling them.
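
To make that concrete, here's a rough Java sketch (my own toy example, not from any real codebase) of the classic lazy singleton as it's usually taught, and the race it invites:

// Classic lazily-initialized singleton -- NOT thread-safe.
public class AppConfig {
    private static AppConfig instance; // shared, unguarded state

    private AppConfig() { } // constructor hidden so callers can't instantiate

    public static AppConfig getInstance() {
        if (instance == null) {          // two threads can both observe null here...
            instance = new AppConfig();  // ...and both construct, giving two "singletons"
        }
        return instance;
    }
}

Synchronizing getInstance() patches this inside one JVM, but it does nothing for instances created on separate nodes.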

The use of a command pattern and a factory pattern seems the best solution to control the number of instances, queue requests, and redirect requests to a single object.

Also, singletons are lazily instantiated...

I think this is a rather open-and-shut case, but I'm just curious whether there is anyone who will debate good coding practices.
RainbowDash52
3/21/2015 11:21:25 PM
At 3/21/2015 10:56:12 PM, Mhykiel wrote: ...

I would love to debate you on this. I am too busy to debate right now, but if you are interested I can debate you in a couple weeks.
Mhykiel
3/21/2015 11:25:08 PM
At 3/21/2015 11:21:25 PM, RainbowDash52 wrote:
At 3/21/2015 10:56:12 PM, Mhykiel wrote: ...

I would love to debate you on this. I am too busy to debate right now, but if you are interested I can debate you in a couple weeks.

Well, that would be interesting, but I was hoping to spark a discussion in the forum instead of actually finding a debate.

But I feel strongly enough about it that I wouldn't mind debating you.
RainbowDash52
3/21/2015 11:37:59 PM
At 3/21/2015 11:25:08 PM, Mhykiel wrote:

Well, that would be interesting, but I was hoping to spark a discussion in the forum instead of actually finding a debate.

But I feel strongly enough about it that I wouldn't mind debating you.

I am fine with just a discussion. I think the singleton pattern is useful because it is a better alternative to using static classes. Static classes have limitations, such as not being able to extend another class, and the singleton pattern lets you get around those limitations by having a non-static class while still preventing multiple instances.
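
Something like this, to illustrate what I mean (class names made up):

// A static-only utility class can't extend anything useful,
// but a singleton is an ordinary instance, so it can:
abstract class Logger {
    abstract void log(String msg);
}

final class FileLogger extends Logger {
    private static final FileLogger INSTANCE = new FileLogger(); // the one instance

    private FileLogger() { } // no outside construction allowed

    static FileLogger getInstance() { return INSTANCE; }

    @Override
    void log(String msg) { System.out.println("[file] " + msg); }
}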
Mhykiel
3/21/2015 11:58:24 PM
At 3/21/2015 11:37:59 PM, RainbowDash52 wrote:

I am fine with just a discussion. I think the singleton pattern is useful because it is a better alternative to using static classes. Static classes have limitations, such as not being able to extend another class, and the singleton pattern lets you get around those limitations by having a non-static class while still preventing multiple instances.

The thing that makes a class a singleton is the self-instantiating method. This creates problems in mulch-threaded environments, among other issues.

And as I suggested, with a factory and command pattern, instantiation can be kept singular, and it opens a way to modify instantiation and track requests.

And the point of static is that members can be called by name without instancing an object. The benefit of static classes, methods, etc. is that they execute straight from the class rather than after constructing an object.

So static doesn't necessarily fix instantiation. Control over object instances is covered directly by the various factory patterns, and that is where it belongs.
RainbowDash52
3/22/2015 12:24:03 AM
At 3/21/2015 11:58:24 PM, Mhykiel wrote:

The thing that makes a class a singleton is the self-instantiating method. This creates problems in mulch-threaded environments, among other issues.

And as I suggested, with a factory and command pattern, instantiation can be kept singular, and it opens a way to modify instantiation and track requests.

And the point of static is that members can be called by name without instancing an object. The benefit of static classes, methods, etc. is that they execute straight from the class rather than after constructing an object.

So static doesn't necessarily fix instantiation. Control over object instances is covered directly by the various factory patterns, and that is where it belongs.

I am unfamiliar with mulch-threading, so I can't comment on that. (I am a recently graduated CS major with Java and C# experience, to give you an idea of my knowledge/experience level.) I am having trouble understanding how a factory pattern would replace a singleton, since the factory pattern is used so that you can dynamically select a type, while a singleton just has one instance of an already determined type. Could you explain the command pattern with factory pattern combination in more detail?
RuvDraba
3/22/2015 12:25:34 AM
At 3/21/2015 10:56:12 PM, Mhykiel wrote:
The Singleton pattern in object-oriented programming, I consider an anti-pattern. Systems today, and for the near future, will be using more parallel computing and running across networks on different platforms.

This is interesting, Mhykiel, and I might be on your side for once.

Are you saying it should never be used? Used sparingly? Used with modifications?
Mhykiel
3/22/2015 12:49:10 AM
At 3/22/2015 12:24:03 AM, RainbowDash52 wrote:

I am unfamiliar with mulch-threading, so I can't comment on that. (I am a recently graduated CS major with Java and C# experience, to give you an idea of my knowledge/experience level.) I am having trouble understanding how a factory pattern would replace a singleton, since the factory pattern is used so that you can dynamically select a type, while a singleton just has one instance of an already determined type. Could you explain the command pattern with factory pattern combination in more detail?

A quick Google search landed me here, and I think the article is sufficient.
http://misko.hevery.com...

Lol, that's funny; I meant multi-threaded applications. But synchronization problems occur in other ways too. Oracle has an article outlining a long list of them.
http://www.oracle.com...

A simple example: in the class's construction method, if a current instance is not found, a request is sent to a command object. That command object holds a queue of requests, establishes whether a new instance needs to be made, and if so hands the job to a factory, reconciling any synchronization problems.

The factory can make any object needed, for any purpose a singleton might be used for.
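
Roughly like this, as a toy Java sketch (all names are mine and illustrative only, not production code): callers submit requests, one worker drains the queue, and the factory only ever runs once per key.

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Future;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.function.Supplier;

// Toy sketch: the broker plays the command role. Every request for a shared
// instance goes onto one queue and is handled by a single worker thread, so
// reconciliation happens in one place instead of in every caller's constructor.
class InstanceBroker {
    private final Map<String, Object> instances = new HashMap<>();
    private final BlockingQueue<Request> queue = new LinkedBlockingQueue<>();

    InstanceBroker() {
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    Request r = queue.take(); // one request at a time
                    Object obj = instances.computeIfAbsent(r.key, k -> r.factory.get());
                    r.result.complete(obj);   // hand the single instance back
                }
            } catch (InterruptedException e) {
                // shut down quietly
            }
        });
        worker.setDaemon(true);
        worker.start();
    }

    // Callers submit a request instead of constructing anything themselves.
    Future<Object> request(String key, Supplier<Object> factory) {
        Request r = new Request(key, factory);
        queue.add(r);
        return r.result;
    }

    private static class Request {
        final String key;
        final Supplier<Object> factory;
        final CompletableFuture<Object> result = new CompletableFuture<>();
        Request(String key, Supplier<Object> factory) {
            this.key = key;
            this.factory = factory;
        }
    }
}

Within one process that makes the race disappear; across nodes you'd still need the broker itself to be the single authority, which is the real point.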

One place I can see singletons being used is loggers. Depending on the situation, we may want to avoid lazy instantiation. And testing becomes an issue with the singleton pattern's tight coupling, which may or may not be a concern for logging data in the future.
Mhykiel
3/22/2015 12:58:20 AM
At 3/22/2015 12:25:34 AM, RuvDraba wrote:

This is interesting, Mhykiel, and I might be on your side for once.

Are you saying it should never be used? Used sparingly? Used with modifications?

As taught, with the class's constructor method able to create a new object itself, I think it should never be used.

If complexity, experience, and time are a concern, then use one with modifications to the constructor, as I briefly wrote about.

Ideally, though, it shouldn't need to be used at all. If one is using an abstract factory, the instantiation logic is hidden; it is then delegated to a command pattern and a factory pattern to construct objects as needed or to send the object reference back to the requester.

So I would say it falls into a very niche use that is better solved by other patterns. If no one got taught what a singleton was, no biggie.

But people would still make them as shortcuts, and so teaching them as a likely pitfall would be the best approach in my mind.
Mhykiel
3/22/2015 1:25:33 AM
At 3/22/2015 12:25:34 AM, RuvDraba wrote:

This is interesting, Mhykiel, and I might be on your side for once.

Disagreeing on Intelligent Design is a poor data point for estimating how much we agree or disagree in general. Take UndeniableReality: it would not shock me if we agree on many more things than we disagree on. Web development is a growing and relatively new field, about 30 years old. And web developers research unrelated fields, perform tests, statistically analyze results, and come up with novel solutions to balance security, accessibility, usability, etc.

I know I can be abrasive at times; I get on the forums during my downtime, when I am tired and cranky, usually because it is before dinner as well. But I try to remind myself that you and others may disagree with me on point A but agree with me in other areas. And that is okay.

As long as we can support our positions with reason and logic, we can find common ground. Even change a mind or two.

I thought the singleton-as-anti-pattern view was rather popular and well substantiated. I'm just excited to have someone respond and disagree.

I've tutored people to pass college programming classes and have worked with people right out of college. Some of the things I do and the ways I describe things, like that simplified list for describing computer systems, I don't see in the repertoire of new graduates.

If you thought this was interesting, and I get some more responses, maybe I will share more of my thoughts on subjects like this.

My preferred method for translatable page elements is a different implementation than I usually see. I prefer using an XML file for the strings and an XSL file for page structure and formatting, which is then styled by CSS.

I like it because one file is associated with one language, so translators using translation programs can easily generate new language dictionaries. HTML structure stays separated from style, and it accommodates different reading directions fairly easily.
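
In Java terms the whole merge is one transform call; the file names here are made up for illustration:

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

// Merge a per-language string dictionary with the page template to emit HTML,
// which the CSS then styles.
public class PageBuilder {
    public static void main(String[] args) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource("page.xsl"));
        // Swap strings.en.xml for strings.fr.xml and you get the French page
        // with zero changes to structure or style.
        t.transform(new StreamSource("strings.en.xml"),
                    new StreamResult("index.en.html"));
    }
}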

These are simple examples I think lend themselves to discussion in a forum like this.


RuvDraba
3/22/2015 2:47:14 AM
At 3/22/2015 1:25:33 AM, Mhykiel wrote:

Disagreeing on Intelligent Design is a poor data point for estimating how much we agree or disagree in general.

Yes. I think you have an analytic temperament, and are methodical to boot. We'd probably agree on an approach to most things, if not always conclusions. :)

I'm currently making dinner for Mrs Draba. (A Spanish-style rolled shoulder of lamb with olives, tomatoes, red wine, verjuice, garlic, and rosemary; patatas bravas -- roast potatoes in a spicy tomato sauce; spinach and artichoke salad; and Spanish-style coleslaw with green peppers and mustard-vinegar dressing.) Depending on how much tempranillo is consumed with dinner, I'll review, reflect, and reply later. :)
RuvDraba
3/22/2015 6:42:27 AM
At 3/22/2015 12:58:20 AM, Mhykiel wrote:
I would say it falls into a very niche use that is better solved by other patterns. If no one got taught what a singleton was, no biggie.

By way of disclosure, the last time I got my hands dirty on Object Oriented Design (OOD) was maybe 20 years ago, and that was a research project rather than a business-critical industry system. These days I play in information strategy, architecture and assurance, but at the enterprise level, namespace and state decisions are important, and that's my angle on the design question.

Thinking about your comment, Mhykiel, and reading up on it, I found a fun rant from Steve Yegge in 2004. [https://sites.google.com...] Aside from making me grin (as it's been too long since I've enjoyed an impassioned developer-rant), he talks through the developer's thinking in detail, and shows how it's easy to get trapped. Anyway, his argument on the slippery slope of losing key OO benefits, and the subsequent risk of refactoring made sense to me, and here's why...

At the enterprise level, I'm struck by how much stateful code written for a single system needs to be migrated to stateless later on, often to be wrapped and included as part of a Service Oriented Architecture. This often happens in large enterprises as purpose-built designs get refactored for multiple lines of business, or point solutions written for separate data stores must merge as the data themselves are merged. Assuming the software were pure OO (not always true), and built in-house (varies), the refactoring and re-architecting costs can get serious... And while this may or may not entail multithreading, it can produce other issues where you need to rethink namespace management and stateful behaviours. So while the idea of having only a single stateful class instance can make perfect sense in a point-solution investment, it might make less sense if you're sucking service requests from an Enterprise Service Bus and rebalancing load.

Regarding multithreading itself, I agree that the demand is likely to increase as hardware speed improvements plateau. However, I think its impact is limited because you usually (should) know in advance how far your processing speeds will have to scale, and these days, Machine Instructions per Second (MIPS) and Floating-point operations per second (FLOPS) are seldom the real scaling limit. For most business usage, it's bandwidth, memory or both, and that's seldom solved by multithreading. Moreover, multithreading isn't the only way to scale processing capacity (often it's servers), and often the systems requiring the fastest processing speeds aren't written OO anyway.

So, my take is a qualified agreement for different reasons, Mhykiel: like Steve, I'm concerned about refactoring -- especially as line-of-business systems become enterprise services. That could just be a reflection of the industry sector I work in, but I think it should also be true for any industry sectors affected by integration and consolidation (which seems to be nearly all of them). It seems to me that at best, a singleton pattern is a trap for inexperienced developers. At worst, singleton designs might be future refactoring risks that are more safely done a different way.

Hope that may be useful.
Mhykiel
3/22/2015 1:41:58 PM
At 3/22/2015 6:42:27 AM, RuvDraba wrote:
At 3/22/2015 12:58:20 AM, Mhykiel wrote:
I would say it falls into a very niche use that is better solved by other patterns. If no one got taught what a singleton was, no biggie.

By way of disclosure, the last time I got my hands dirty on Object Oriented Design (OOD) was maybe 20 years ago, and that was a research project rather than a business-critical industry system. These days I play in information strategy, architecture and assurance, but at the enterprise level, namespace and state decisions are important, and that's my angle on the design question.

Thinking about your comment, Mhykiel, and reading up on it, I found a fun rant from Steve Yegge in 2004. [https://sites.google.com...] Aside from making me grin (as it's been too long since I've enjoyed an impassioned developer-rant), he talks through the developer's thinking in detail, and shows how it's easy to get trapped. Anyway, his argument on the slippery slope of losing key OO benefits, and the subsequent risk of refactoring made sense to me, and here's why...


Haha, funny rant, but so true. As you might know with programming, you should never say "never do this." The singleton is more like a vice grip than the right tool for the job. I personally don't use them, and when I do, it's with some modifications, and they're meant to be removed later on.

At the enterprise level, I'm struck by how much stateful code written for a single system needs to be migrated to stateless later on, often to be wrapped and included as part of a Service Oriented Architecture. This often happens in large enterprises as purpose-built designs get refactored for multiple lines of business, or point solutions written for separate data stores must merge as the data themselves are merged. Assuming the software were pure OO (not always true), and built in-house (varies), the refactoring and re-architecting costs can get serious... And while this may or may not entail multithreading, it can produce other issues where you need to rethink namespace management and stateful behaviours. So while the idea of having only a single stateful class instance can make perfect sense in a point-solution investment, it might make less sense if you're sucking service requests from an Enterprise Service Bus and rebalancing load.


That's a good point; I was directing my post at programmers and project managers. We could talk for hours on migration and growth: get stateless earlier, good comments in code, no hard-coded variables, memory caching, logging, etc. If the skeleton of the system is not adaptable to growth and migration, it is a huge cost.

And one of those sore points is the use of singleton patterns. As I said earlier, people are going to use them, but they should be taught the slippery slope and the pitfalls.

Regarding multithreading itself, I agree that the demand is likely to increase as hardware speed improvements plateau. However, I think its impact is limited because you usually (should) know in advance how far your processing speeds will have to scale, and these days, Machine Instructions per Second (MIPS) and Floating-point operations per second (FLOPS) are seldom the real scaling limit. For most business usage, it's bandwidth, memory or both, and that's seldom solved by multithreading. Moreover, multithreading isn't the only way to scale processing capacity (often it's servers), and often the systems requiring the fastest processing speeds aren't written OO anyway.


While you may still need concurrent state at the enterprise level, the platform will probably not be using singleton or OOP patterns anyway. One thing we do see trending, and I think it will continue to grow, is stateless apps from multiple devices at the user level, making bottom-tier OOP design patterns and principles more prevalent.

There will be a lot more small-developer and community-generated apps accessing services from enterprise systems, such as small games and websites using Google+ and Facebook for login.

So, my take is a qualified agreement for different reasons, Mhykiel: like Steve, I'm concerned about refactoring -- especially as line-of-business systems become enterprise services. That could just be a reflection of the industry sector I work in, but I think it should also be true for any industry sectors affected by integration and consolidation (which seems to be nearly all of them). It seems to me that at best, a singleton pattern is a trap for inexperienced developers. At worst, singleton designs might be future refactoring risks that are more safely done a different way.

Hope that may be useful.

It's an interesting addition. It is certainly always good to keep future-proofing in mind when coding; it's hard to get anything ideal balancing performance, cost, and time. But when it comes to singletons, I do not see the benefits outweighing the cons in 99% of cases.
RuvDraba
3/22/2015 2:11:30 PM
At 3/22/2015 1:41:58 PM, Mhykiel wrote:
It is certainly always good to keep future-proofing in mind when coding; it's hard to get anything ideal balancing performance, cost, and time. But when it comes to singletons, I do not see the benefits outweighing the cons in 99% of cases.

I understand and agree. To me it looks like the sort of thing you need to get the solution architect to nail down in coding and review standards, then make sure any lead developers know about it (assuming your solution architect is any good to start with.)

To be honest though, I don't see many big projects using pure OO. (By big, I mean hundred-million to half-billion dollar investments.) Where I play, OO is often used to wrap and interface legacy and non-OO solutions. Or it sits in the presentation and orchestration layers, while business logic and data are done procedurally. So the OO purism about namespaces, abstraction and specialisation tends to break on the boundaries anyway, and the stateful, multithreading, memory-leaking questions may depend more on whatever else you're integrating than on how you wrap it -- though of course it still counts.
UndeniableReality
3/22/2015 2:28:43 PM
At 3/22/2015 2:11:30 PM, RuvDraba wrote:

To be honest though, I don't see many big projects using pure OO. (By big, I mean hundred-million to half-billion dollar investments.) Where I play, OO is often used to wrap and interface legacy and non-OO solutions. Or it sits in the presentation and orchestration layers, while business logic and data are done procedurally. So the OO purism about namespaces, abstraction and specialisation tends to break on the boundaries anyway, and the stateful, multithreading, memory-leaking questions may depend more on whatever else you're integrating than on how you wrap it -- though of course it still counts.

This is what I've seen as well. Interestingly though, OOP and very high-level languages (Matlab, R, for computational programming) are common in R&I for prototyping as well, and solutions are then often translated into C by some layer of the dev team. So it seems to have its place on both ends, just not so much in the meaty middle of the final product.
Mhykiel
3/22/2015 2:43:34 PM
At 3/22/2015 2:11:30 PM, RuvDraba wrote:

To be honest though, I don't see many big projects using pure OO. (By big, I mean hundred-million to half-billion dollar investments.) Where I play, OO is often used to wrap and interface legacy and non-OO solutions. Or it sits in the presentation and orchestration layers, while business logic and data are done procedurally. So the OO purism about namespaces, abstraction and specialisation tends to break on the boundaries anyway, and the stateful, multithreading, memory-leaking questions may depend more on whatever else you're integrating than on how you wrap it -- though of course it still counts.

I prefer REST by way of XML (not exactly SOAP) served by Cobol/CICS. I see so many developers moving the data to their preferred platform, but a single-source-of-truth principle means keeping the data on the mainframe. As mentioned before, with the rise of small, unrelated developers making apps, enterprise will be in the business of providing services. So I suggest making the use of those services as easy as possible: simple stateless requests served up as XML objects or files.

Any disagreement there?
RuvDraba
3/22/2015 2:44:55 PM
At 3/22/2015 2:28:43 PM, UndeniableReality wrote:
it seems to have its place on both ends, just not so much in the meaty middle of the final product.

Yes. There's a lot of attraction in going from high-level design to something that will actually run: the more you can do this through incremental detail, the more accountable the traceability and the more informative the testing.

However, at some point you hit a relational database, or a Commercial, Off-the-Shelf component, or it just runs too slowly due to too many levels of abstraction, and that's when you have the C-monkeys and similar proceduralise it. :)

And, for all the attractions of OOA&D, I've seen few Real People (i.e. business customers) warm to it. Show them pictures of processes and boxes with joined-up conceptual information, and they get it. Show them swimlanes, and their eyes glaze. Show them a class diagram representing their business, and they show you the door.

This matters, because it breaks the traceability. What they're signing off isn't your class diagrams, but the pictures you abstracted them to. So someone in the design team (or else the project manager) had better be good at turning pictures into class-objects and vice-versa, or else the one project is telling two different stories. :)
RuvDraba
3/22/2015 3:27:17 PM
At 3/22/2015 2:43:34 PM, Mhykiel wrote:

I prefer REST by way of XML (not exactly SOAP) served by Cobol/CICS. I see so many developers moving the data to their preferred platform, but a single-source-of-truth principle means keeping the data on the mainframe.

Ah. You're managing the legacy issue bigtime then. :)

If the data are already in one spot, then there's a strong case to keep them there. However... the industry is swinging from big, legacy in-house coded systems to integrating Commercial off-the-Shelf (CotS) components, sometimes on different platforms. When that happens, you don't get much choice: the financials and customer data will be in some CotS Enterprise Resource Planning (ERP) system, while your operational, line-of-business data may be in multiple other systems, and then the single source of truth -- the information real people make real strategic and operational decisions about -- typically resides in none of these, but in some data warehouse, carefully protecting corporate data from the vicissitudes of system changes (at the cost of ludicrously expensive replumbing every few years. :D)

So I suggest making the use of those services as easy as possible: simple stateless requests served up as XML objects or files.

Yes -- unless you can't. :) There's some hidden context here, Mhykiel. If you're just updating customer contact info, say, you can see REST doing that very well -- simple operations on a single resource. But if you're doing anything that needs to expose business logic -- like an online bank transaction, or some system-to-system collaboration needing a two-phase commit (e.g. an operation to bundle inventory items and dispatch) -- you might need SOAP instead. But using REST by default is a popular position, for a range of reasons it sounds like you already know. :)
Mhykiel
3/22/2015 4:25:26 PM
At 3/22/2015 3:27:17 PM, RuvDraba wrote: ...

First Data does worldwide merchant account transactions, and that runs on Cobol. There are some places where data is moved from the mainframe to Oracle and then used by Java. My first question, of course, is why? Cobol/CICS can do all those actions. There are IBM Redbooks available that detail using CICS for web services and implementing routing based on the httpd spec.

And what is business logic? Business logic is logic that constructs a database query or arranges results (and though that is for presentation, it really should ONLY apply structure). So this does not need to be as fat as another platform.

The nature of the CotS landscape, I think, comes down to human perception. Like when you were talking about trying to sell the project to a client: sometimes it comes down to putting it in a manner they can relate to, more than the plan that is best for them. I can't tell you how often the client doesn't know what CAN be done for them, or what they think they can't change and just deal with. I've found offices taking hours to copy from spreadsheets instead of asking for a sorting function (don't laugh, I'm serious).

And then of course there are the buzzwords. But I say it is sometimes not whether an architecture is the right solution, but whether it is perceivable as such by the humans making the decision. This prevalence of CotS, I think, is because managers want to own their data and own their branch, and they don't see that separation if their data is stored in a community pool. But this is a presentation problem, not a back-end requirement.

I of course disagree with the term legacy when applied to mainframe. It's derisive lol :)
The mainframe isn't going anywhere. Not because it is too big to fail, but because it is relevant.
RuvDraba
3/22/2015 4:59:14 PM
At 3/22/2015 4:25:26 PM, Mhykiel wrote:
what is business logic? Business logic is logic that constructs a database query or arranges results (and though that is for presentation, it really should ONLY apply structure). So this does not need to be as fat as another platform.

Business logic is the operationalisation of business decisions and process control. It can be implemented in part by queries, but can actually be very complex.

I of course disagree with the term legacy when applied to mainframe. It's derisive lol :)
The mainframe isn't going anywhere. Not because it is too big to fail, but because it is relevant.

I concur. For certain compute-intensive cases (like computational science) it's perfect (though scalable parallelism can be better in niche uses too.)

The legacy issue isn't the hardware: it's what's done on the mainframes, and how well that adapts to our changing world.

To pick a really ugly example, one of our clients -- a large Australian government agency administering $4B per year -- developed its core business software in 1983. At that time, a relational database wasn't available, and so it used a hierarchical database, with all the challenges that poses in representing and maintaining many-to-many relationships. This system isn't just business-critical, it's social-security-critical, in the sense that if it fails, Australia's social security system grinds to a halt.

Obviously, the sheer scale and reliability demands made mainframe a good choice at the time, and so it has proven.

But as you can imagine, the software and platforms have aged significantly in 32 years, and the world has moved on too. Services have devolved to multiple agencies and jurisdictions; customers are encouraged to seek Web-based self-service rather than shopfront service; service and payments may eventually be outsourced to payment companies and private providers; agencies are more connected; assurance is now intelligently risk-managed; documents are now recorded electronically; and increasingly, customers will be carrying key personal data on or around their person.

That's not to say that none of this could be implemented on a mainframe, but there are serious legacy issues in trying to adapt a central-processing, central-data architecture to an increasingly devolved, decentralised, service-centric, joined-up, cloudified, mobile, smartphone-enabled world.

Plus there's the challenge of maintaining the skills, expertise and design knowledge in a system it has been estimated will cost $1B to replace.

I'm not suggesting mainframe technology itself is a legacy issue, but mainframe-dependence surely is. :)

(I won't even begin to talk about what happens when enterprises split, or have to merge, but financial institutions encounter this at times, and when your data are all in one bucket, and your processing is all intermingled on the same physical platform it's not exactly pretty. :D )
Mhykiel
3/22/2015 5:52:00 PM
At 3/22/2015 4:59:14 PM, RuvDraba wrote: ...

That's not to say that none of this could be implemented on a mainframe, but there are serious legacy issues in trying to adapt a central-processing, central-data architecture to an increasingly devolved, decentralised, service-centric, joined-up, cloudified, mobile, smartphone-enabled world.

But you see, you said decentralized and cloudified in the same sentence. Mainframe and enterprise business will be about providing services to a diverse client base. The bulk of business logic should be left to them.


Plus there's the challenge of maintaining the skills, expertise and design knowledge in a system it has been estimated will cost $1B to replace.

I'm not suggesting mainframe technology itself is a legacy issue, but mainframe-dependence surely is. :)

But we have run into problems maintaining separate stores of redundant information and migrating changes downstream. A little bit of centralization isn't bad.


(I won't even begin to talk about what happens when enterprises split, or have to merge, but financial institutions encounter this at times, and when your data are all in one bucket, and your processing is all intermingled on the same physical platform it's not exactly pretty. :D )

So what is better: ad-hoc adapters integrating diverse off-the-shelf systems, or splitting one platform? Considering that maintaining a system happens more often than dividing one, I would suggest that establishing secure resources and standard communication protocols pays off more. That doesn't necessarily mean centralized on a mainframe, but certainly the mainframe makes such a practice easier.

One thing is, we already have a protocol that is prevalent and worldwide: HTTP and XMLHttpRequest. SOAP is in decline, and the need is being replaced by RESTful APIs, with calls being answered in JSON. This, I think, moves the right amount of business logic to the client using the service.
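
For example, the consuming side can be this dumb (plain Java, hypothetical endpoint):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// One stateless GET, answered in JSON -- no session, no SOAP envelope.
// The endpoint URL is made up; any RESTful resource works the same way.
public class RestDemo {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://api.example.com/customers/42");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Accept", "application/json");
        StringBuilder json = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                json.append(line);
            }
        }
        System.out.println(json); // e.g. {"id":42,"name":"..."}
    }
}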
RuvDraba
3/22/2015 6:25:11 PM
At 3/22/2015 5:52:00 PM, Mhykiel wrote:

you said decentralized and cloudified in the same sentence.

Yes, because a single business process could be broken across multiple providers, with the actual processing on diverse systems in diverse physical locations -- or even co-located in the same cloud for speed, yet managed under disparate responsibilities.

If that sounds odd, consider how an online retail firm presently inter-operates with say Paypal and DHL to fulfill your order. Now imagine the retailer bidding for your custom through a brokerage, and meanwhile putting the fulfilment contract up for a spot-price bid instead via an online delivery brokerage, while some loyalty scheme subsidises your purchase.

In that scenario, who owns the customer information? Who owns the business rules for pricing and fulfillment?

Mainframe and enterprise business will be about providing services to a diverse client base. The bulk of business logic should be left to them.

Enterprise =/= mainframe. Enterprise is delimited by responsibilities, not technology. And the way the business rules are calculated will depend on who has the data, how businesses collaborate, and a range of external factors the system designer cannot control.

But we have run into problems maintaining separate stores of redundant information and migrating changes downstream. A little bit of centralization isn't bad.

I think it's essential, Mhykiel. But what is centralisation? Is it physical or logical?

And is there a circular argument here too? Having a mainframe is desirable because it's convenient to centralise data and processing. But then, because you have a mainframe, you must centralise data and processing regardless of whether that's good for business agility. So is it a convenience after all?

So what is better: ad-hoc adapters integrating diverse off-the-shelf systems, or splitting one platform?

It depends on what the business needs. I used to work for a system integrator, and the reason we had any work at all is that businesses had to glue CoTS together to do what they needed to do. The alternative -- bespoke builds or customisations on a single platform -- didn't make sense.

I'm not at all opposed to mainframes, but the ugliest and most covering-your-eyes-whimperingly intractable legacy issues I've ever seen have all been mainframe issues. :)
Mhykiel
3/22/2015 6:36:37 PM
At 3/22/2015 6:25:11 PM, RuvDraba wrote:
At 3/22/2015 5:52:00 PM, Mhykiel wrote:

Mainframe and enterprise business will be about providing services to a diverse client base. The bulk of business logic should be left to them.

Enterprise =/= mainframe. Enterprise is delimited by responsibilities, not technology. And the way the business rules are calculated will depend on who has the data, how businesses collaborate, and a range of external factors the system designer cannot control.

I wasn't equating them. I used the word and.


At 3/22/2015 6:25:11 PM, RuvDraba wrote:

It depends on what the business needs. I used to work for a system integrator, and the reason we had any work at all is that businesses had to glue CoTS together to do what they needed to do. The alternative -- bespoke builds or customisations on a single platform -- didn't make sense.

I'm not at all opposed to mainframes, but the ugliest and most covering-your-eyes-whimperingly intractable legacy issues I've ever seen have all been mainframe issues. :)

Oh, I'm not against gluing off-the-shelf stuff together to get a solution. I do think breaking stuff off the mainframe, when you have one, is often pushed more by buzzwords and trending preferences than by what is the right solution.
RuvDraba
3/22/2015 7:17:29 PM
At 3/22/2015 6:36:37 PM, Mhykiel wrote:
I'm not against gluing off-the-shelf stuff together to get a solution. I do think breaking stuff off the mainframe, when you have one, is often pushed more by buzzwords and trending preferences than by what is the right solution.

It's right and proper to be skeptical about buzzwords and trends. It's also often a very good idea to keep skill-sets and technologies from getting too disparate, since the cost to maintain disparate systems can get painful over time.

But those aren't a technologist's decisions: they're actually business decisions about technology. And redefining the business problem to suit one's own skill-set is a known and problematic design bias -- as you might have noticed yourself, if you've ever seen (say) a relational database twisted to store objects in XML, or Lotus Notes cancer, Sharepoint melanoma, or galloping SAPulitis. :)
Mhykiel
3/22/2015 7:59:48 PM
At 3/22/2015 7:17:29 PM, RuvDraba wrote: ...

Well, everything has its place. I'm not one to shy away from what I'm unfamiliar with. Some technologies are inadequate solutions because they no longer support the business environment.

But no worries: 20 years from now a SOAP-like protocol will reign king, and it will be called something cool like ghostwhisper or evolvedpackets.

Responsive web designs, for example, are minimalism revamped. I can't believe I'm saying this, but with the next generation of kids going Minecraft-crazy, pixelated icons will become a rage... again.
RuvDraba
3/22/2015 8:36:19 PM
At 3/22/2015 7:59:48 PM, Mhykiel wrote:
I can't believe I'm saying this, but with the next generation of kids going Minecraft-crazy, pixelated icons will become a rage... again.

Too late...

http://store.steampowered.com...