
Iterator
The Videos
The Code
https://github.com/diegopacheco/java-pocs/tree/master/pocs/simple-iterator
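For reference, the core of a simple iterator POC usually looks something like this — a minimal sketch where a container exposes its elements through `java.util.Iterator` (class names here are illustrative, not necessarily the ones in the repo):

```java
import java.util.Iterator;
import java.util.NoSuchElementException;

// A tiny container exposing its elements through java.util.Iterator.
// Implementing Iterable is what makes the for-each loop work.
public class NumberBag implements Iterable<Integer> {
    private final int[] values;

    public NumberBag(int... values) {
        this.values = values;
    }

    @Override
    public Iterator<Integer> iterator() {
        return new Iterator<Integer>() {
            private int cursor = 0; // next position to read

            @Override
            public boolean hasNext() {
                return cursor < values.length;
            }

            @Override
            public Integer next() {
                if (!hasNext()) throw new NoSuchElementException();
                return values[cursor++];
            }
        };
    }

    public static void main(String[] args) {
        int sum = 0;
        for (int v : new NumberBag(1, 2, 3)) { // for-each works because of Iterable
            sum += v;
        }
        System.out.println(sum); // prints 6
    }
}
```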
Cheers,
Diego Pacheco
Dependency Management Best Practices
Best practices often depend on context. Once enough complexity piles up, best practices do not necessarily translate from one company to another; however, there are cases where they apply and make sense. Not always, but for dependency management, IMHO, these are golden rules and excellent practices to follow. Here are some dependency management best practices:
Dependency management is not only about using specific or better tools. It's about a process and a culture, which require attention and automation as well. Let's do a deep dive into each of these practices and understand why they are important.
Use a Dependency management tool (Ant/Ivy, Maven, Gradle). Do Explicit Dependency management.
It might sound obvious, but it's not uncommon to see infrastructure projects downloading binaries manually instead of using explicit dependency management via tools like Ant/Ivy, Maven, and Gradle. No matter the language you use, and no matter if it is engineering or DevOps code, you should do explicit dependency management, because it is easier to maintain and we can rely on common processes and tools for improvements and housekeeping.
Explicit dependency management means explicitly declaring dependencies in a file that is used by the dependency management tool. You should avoid embedded dependencies and scripts that download dependencies outside of your main tool.
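For example, with Maven an explicit dependency is just an entry in the project's pom.xml that the tool fully owns (the artifact and version below are only illustrative):

```xml
<dependencies>
  <dependency>
    <groupId>com.google.code.gson</groupId>
    <artifactId>gson</artifactId>
    <version>2.8.6</version>
  </dependency>
</dependencies>
```

Gradle and Ivy have equivalent declarations; the point is that every dependency is visible in one declarative file, instead of being fetched by ad hoc scripts.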
Use Artifact Management Solution (Nexus, Archiva, Artifactory) - Management but Mainly: Cache
Software tends to grow with your business. As you grow, build times can get very slow. A cache is a must-have feature, because there are multiple engineers downloading artifacts from the web, and you often have multiple cloud environments like DEV, STAGING, STRESS, PROD, etc. Artifact management solutions like Nexus, Archiva, and Artifactory can also help with better dependency management, but one of the main benefits is having a central cache and a central repository management location.
Remove dependencies you don't use.
This might sound silly, but unused dependencies are as bad as dead code because they make upgrade efforts harder and end up creating technical debt. It's not uncommon for dependencies to have 3rd party dependencies of their own, so you might depend on something transitively instead of directly, and having that scenario for a dependency you don't even use is bad. It's unclear, confusing, raises false positives, and makes reasoning about refactoring efforts much, much harder.
Use Consistent Versioning (MAJOR, MINOR, SEC/PATCH).
Versioning is as old as the snakes in the jungle (as we used to say in Brazil). However, people still don't get it right. Why? People know when to use MAJOR (major API-breaking change), MINOR (minor change, no break in backward compatibility), and security/patch releases (minor bug fix or security patch, no impact). But people do not do it. Why? Most of the time it is a combination of lack of discipline, lack of ownership, and the pain of upgrading other people's dependencies. Central teams can be great in the sense of reducing some costs, but they certainly hide some of the pains that, if people faced them directly, they would definitely deal with differently. Having consistent versioning is super important: for design, for testing, and for healthy engineering practice, I would argue. This part requires discipline, and every single binary should embrace this principle.
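As a rough sketch of that contract, a MAJOR.MINOR.PATCH comparator could look like this (class and method names are illustrative, not from any specific library; real versions with qualifiers like -SNAPSHOT or -rc1 need more handling):

```java
import java.util.Comparator;

// Minimal MAJOR.MINOR.PATCH parser and comparator.
public record SemVer(int major, int minor, int patch) implements Comparable<SemVer> {

    public static SemVer parse(String s) {
        String[] p = s.split("\\.");
        return new SemVer(Integer.parseInt(p[0]),
                          Integer.parseInt(p[1]),
                          Integer.parseInt(p[2]));
    }

    @Override
    public int compareTo(SemVer o) {
        // Compare field by field, most significant first.
        return Comparator.comparingInt(SemVer::major)
                .thenComparingInt(SemVer::minor)
                .thenComparingInt(SemVer::patch)
                .compare(this, o);
    }

    // Crossing a MAJOR boundary signals a possible API break;
    // MINOR and PATCH upgrades should be safe to take automatically.
    public boolean breakingChangeFrom(SemVer old) {
        return this.major > old.major;
    }
}
```

This is also why consistent versioning pays off in automation: a tool can auto-apply PATCH and MINOR upgrades and flag MAJOR ones for review, but only if everyone bumps versions honestly.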
Do not use multi-project EXTERNAL POMS.
Don't be fooled by the word POM. This principle works for any dependency management tool. You should not share multi-project external configs for dependency management. Either you have a monolith or a monorepo where all the code is in one place, or, if you have multiple GitHub repositories, you should not share these files (POMs). Why? Well, because they are EVIL. They create coupling, they make upgrades harder, and they kill microservices.
If you will have shared libraries they should be:
* Small
* Independent
* Isolated (don't share POMs or configs)
Otherwise, you will build a distributed monolith, and binary coupling will prevent you from upgrading when you need to. Never trade coupling for convenience or developer experience. However, if you have a monolith or a monorepo, it is perfectly fine to share POMs.
Keep Dependencies up to Date (But do not update at deploy time - Immutable Infrastructure)
Another super important practice is to keep your dependencies up to date. That's important for several reasons, such as:
* Prevent Bugs
* Fix Security Bugs
* Reduce Tech Debt
Updating dependencies often follows the same principles as branches in configuration management. If you are going to have a long-lived branch (which you should avoid at all costs), you need to merge every day to reduce the complexity and issues of an old-fashioned big-bang merge. Library updates work the same way. For security patches, and even minor releases, you should be able to upgrade easily.
DevOps has a principle called Immutable Infrastructure: you do not want to upgrade libs right before a deploy or when a service restarts, because that breaks the principle of immutable infrastructure. However, at the same time, you want to AUTOMATE your dependency management and update libs frequently. Often engineers do not have the mindset to keep updating libs, which can be fixed with proper plugins and automation.
Use Dependencies Carefully (Shared-Libs) avoid coupling as much as you can.
When we ship internal shared libraries, we need to be very careful. Shared libs should be treated by applying the same principles we apply to services. It's super important to pay extra attention to 3rd party deps in shared libs in order to avoid binary coupling. It's fine to use shared libs; sometimes that's the right solution. However, there is huge abuse of internal shared libs in the technology industry.
Better dependency management helps design and testing. It makes CI/CD more effective and, in the long run, increases the speed and ability to ship better software more frequently. Currently, we live in an era where every company is trying to do proper CI/CD, observability, services, DevOps, SRE, and many other important things; however, we often forget that dependency management is an important part of building software, one that ends up charging a high price at scale.
Cheers,
Diego Pacheco
DDD in a nutshell
DDD (Domain-Driven Design) is about communication: communication between the business, the business experts (not business proxies), and the engineers (the folks who build the solution). DDD's main interest is the CORE domain of the problem you want to solve. In order to explore the communication between business experts and engineers, we should focus on a common language, which DDD calls the "Ubiquitous Language". DDD has many concepts and patterns, such as:
* Domain: Logical area that defines your problem. i.e: Retail, HR, purchase, etc.
* Bounded Context: Logical Boundary on the code for the solution domain, can be defined as:
* Organization
* Code Base
* Database Schemas
* Services and Microservices
* Layered Architecture: Separates the core domain from the UI, persistence, and databases.
* Entities: Domain objects defined by a unique ID (UUID). i.e: User, Customer, Job, Message.
* Value Objects: Immutable objects; they have attributes but no unique ID. i.e: Name, JobTitle, Address.
* Aggregates: Cluster entities and value objects into aggregates to define a boundary. One entity should be the root of the aggregate, so external objects hold a reference to the root only.
* Factories: Used to create complex objects and aggregates. The client does not need to know the internal details.
* Domain Events: Record discrete events to model activity within a system.
* Services: A significant process or transformation in the domain, used when something is not the natural responsibility of an Entity or VO (Value Object).
* Repository: It's a service that uses a global interface to provide access to all entities and value objects within a particular aggregate collection.
* Context Map: A diagram to enforce strategic domain integrity.
* Bounded Context Patterns:
* Anti-Corruption Layer: A wrapper for a legacy API, or to protect the domain from a bad API.
* Shared Kernel: N bounded contexts depending on a shared kernel (core).
* Customer/Supplier: Like client/server; there is a high dependency.
* Conformist: Upstream and downstream teams that ARE NOT ALIGNED (accept it as it is).
* Partner: Mutual dependency between both contexts; high alignment is needed for proper modeling.
In order to set the record straight, you don't need to use all the patterns to be doing DDD. By the way, if you are using Spring Data and have repositories, that alone does not mean you are doing DDD. DDD is a process; it's not a one-time discrete shot. There is an agile collaborative modeling exercise called Event Storming which can help you a lot. Event Storming can be quite useful for figuring out inconsistencies in the breakdown of your services and saving lots of refactoring and headaches in the future.
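To make a few of the building blocks above concrete, here is a minimal, illustrative Java sketch (the domain and all names are invented for the example) of a value object, an entity, and an aggregate root:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.UUID;

// Value Object: immutable, compared by its attributes, no identity of its own.
record Address(String street, String city) {}

// Entity inside the aggregate: defined by a unique ID, not by its attributes.
class OrderLine {
    final UUID id = UUID.randomUUID();
    final String product;
    final int quantity;

    OrderLine(String product, int quantity) {
        this.product = product;
        this.quantity = quantity;
    }
}

// Aggregate root: external code references the Order only, never an
// OrderLine directly, so all invariants are enforced in one place.
class Order {
    final UUID id = UUID.randomUUID();
    private final Address shipping;                      // value object
    private final List<OrderLine> lines = new ArrayList<>();

    Order(Address shipping) {
        this.shipping = shipping;
    }

    void addLine(String product, int quantity) {
        // Invariant guarded at the root, not scattered across callers.
        if (quantity <= 0) throw new IllegalArgumentException("quantity must be positive");
        lines.add(new OrderLine(product, quantity));
    }

    int totalItems() {
        return lines.stream().mapToInt(l -> l.quantity).sum();
    }
}
```

Note how the aggregate boundary is what gives you encapsulation: callers can only mutate the order through the root, which is where the flexibility and strong design benefits below come from.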
DDD Benefits
DDD has solid benefits, such as:
* Better communication
* Helps the organization retain knowledge (often a huge issue at scale and as time passes)
* More flexibility via strong design and encapsulation
DDD will not do any magic for you. It's just a tool, at the end of the day it depends on how you use it.
DDD Challenges
Like anything in software, there are trade-offs. DDD might not work well in a very technical project. It also requires a domain expert working with the engineering team, which unfortunately is hard to arrange nowadays.
Reality and DDD
In order for organizations to scale, they often split strategy and execution, product and technology. Specific people are often used as proxies to answer questions about priorities and "representation of interests". That representation happens via PMs (Product Managers), PMs (Project Managers), POs (Product Owners), and BAs (Business Analysts). There are good PMs, POs, and BAs out there, but with scale you need more and more of them, and unfortunately lots of times you have weak, low-performing people in those positions.
When a startup starts, by definition and nature the company is super glued to the customers, the problem, and the real needs. As the company succeeds, starts growing, and ends up as an enterprise, that connection starts to get weak. DDD might be an important tool to help promote not only communication but also a shared understanding of real business problems.
The Way forward
Recently, Uber rediscovered DDD and applied it to their hyper-scale microservices solution, called DOMA (Domain-Oriented Microservices Architecture). DDD is a perfect fit for SOA and microservices. It's a great modeling exercise, and it has been used for decades in our industry. Analysis and design are lost disciplines, like DDD; I hope we can rediscover these techniques and leverage them in the present and the future.
Cheers,
Diego Pacheco
The Video
The Instructions
Cheers,
Diego Pacheco
The Video
The Code
https://github.com/diegopacheco/java-pocs/tree/master/pocs/java-15-fun
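As a taste of what such a POC can cover, here is a small, illustrative example of two features tied to that release: text blocks (finalized in Java 15, JEP 378) and records (a second preview in 15, JEP 384, final in Java 16). The class name and contents are my own sketch, not necessarily what the repo contains:

```java
// Text blocks were finalized in Java 15; records were still a preview
// there (run with --enable-preview on 15, or use Java 16+ as-is).
public class Java15Fun {

    record Point(int x, int y) {}   // compact immutable data carrier

    public static void main(String[] args) {
        // Text block: a multi-line string without escape noise.
        String json = """
                {"x": 1, "y": 2}
                """;
        Point p = new Point(1, 2);
        System.out.println(json.strip());
        System.out.println(p); // prints Point[x=1, y=2]
    }
}
```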
Cheers,
Diego Pacheco
Micro Frontends Benefits
One benefit of micro frontends is developer experience. IMHO that's the weakest argument, and DX should not be the main reason to do anything in tech. IMHO the real benefits are:
* Improved team topologies - reducing the communication blast radius
* Reduced management overhead (by reducing coordination)
* Faster build times via smaller and more focused components
* Technology diversity (being able to work with different frameworks)
The frontend is getting bigger and more complex as time passes. We cannot ignore this issue. Technical debt is a big problem on the backend and is becoming a real issue on the frontend as frameworks evolve faster, and, believe it or not, it is hard to find AngularJS engineers. Micro frontends address a real issue in the sense of scalability and the need to introduce better and newer technology, as big companies have trouble refactoring huge code bases. However, like any solution, there are drawbacks and issues, so let's take a look at some micro frontend issues.
Micro Frontends issues
There are lots of challenges and potential issues with micro frontends such as:
* Operational Complexity - Deploys are more complex
* Performance - Might affect the user experience (slower load time)
* UX/UI Consistency - Need to implement the DS (Design System) across all the different frameworks
* Tooling / Framework support - IMHO this will be fixed, but it is a huge issue right now
The idea of micro frontends is to have independent teams. This can ship things faster, since some teams might be faster than others, and in order to do that, you need a different code base and a different deployment pipeline. Performance might be a problem, since you might load different frameworks, and even different versions of them I would say (you actually want to avoid this).
Design systems are often implemented at the component level, so if you have components in one framework like Angular and another team wants to use React, they might need to re-implement the components, which might be a big no-go for enterprises trying to adopt micro frontends.
If you can make the design system implementation more CSS-driven rather than framework-driven, you can minimize this impact; however, it is very hard to fix completely. A design system requires the exact same behavior, and duplicating it across N technologies (Vue, Angular, React, jQuery/VanillaJS) is expensive, since you have fonts, pixels, errors, and lots of small details. It's possible, but still hard.
Tooling is a big issue; there is no proper tooling around. There are some solutions with better tooling, but often that means doubling down on a specific framework. I believe the tooling problem will be fixed eventually. Zalando has an interesting solution called Mosaic. There is also OC (OpenComponents). However, the standard way to go looks to be around Web Components. It's also possible to do something reasonable with AWS and simple infrastructure.
Runtime issues
One big difference between the backend and the frontend/mobile is that on the backend we can run the software on different servers, so isolation is totally possible. Frontend/mobile share a common runtime, which is the phone or the browser. It's possible to use shell-like methods such as iframes, Web Components, or specific framework solutions, but at the end of the day the code runs in the same place, which is the browser. It's possible to load different frameworks and solutions, but there are performance and complexity penalties. Lots of companies already have frontend monoliths, and one common approach is to use the monolith as a shell with links to the micro frontend apps.
Is it about granularity or modularization?
IMHO we don't want to make the components that micro. One hard lesson we learned on the backend is that things that are too micro also have lots of issues. IMHO what really matters is ISOLATION and modularization. So you definitely want a modular JS application, but not necessarily everything needs to be at the micro level. Micro is not always the right level of abstraction, and that is one of the things that killed microservices.
There is a huge difference between a brownfield and a greenfield project, and between those and a shared old monolith application. IMHO React is the best web framework we have right now; if you have a new project and create proper components, I would say you don't need micro frontends. However, looking at a more complex enterprise where you have all sorts of solutions, I would say you definitely need some solution for modularization.
Having a micro frontend solution implies having some answer for the following concerns:
* SSR (Server Side Rendering) VS CSR (Client Side Rendering)
* Cache
* Layout assembling or linking
* Routing
* Testing
It's ideal to have tooling and good solutions around those concerns. It's possible to do micro frontends without tooling for the concerns I mentioned; however, keep in mind it might be less productive, more complex, and much harder to test than a good old monolith frontend app. IMHO micro frontends might be making the same mistake as microservices by focusing on the "micro". What matters is some level of isolation (given the browser reality) and modularization.
Cheers,
Diego Pacheco