Is it finally time for PHP generics?

Submitted by Larry on 13 May 2026 - 8:13pm

Earlier this week, Seifeddine Gmati, aka "azjezz," posted an RFC for PHP generics. It's the most robust and promising approach to that feature to come along to date, and it's generating a lot of buzz. The folks at ThePHP.cc asked me to blog about it so they'd have something to reference in their new PHPReads newsletter, so here it is. :-)

The background

The first half of Seif's RFC is an absolutely superb history of static analysis and generics in PHP, and it makes the case for why the feature is needed. If you want the full story, go read it; it's very approachable, even for someone with only a casual programming background.

For those who don't want to read even that, the short-short version is that generics let you write a single class or function that can operate over many different types, while keeping each use of that type consistent. For example:

class Test<A>
{
    private A $value;
    
    public function doStuff(A $val): A { ... }
}

$x = new Test::<DateTimeImmutable>();
$y = new Test::<Product>();

In $x, any place A was listed will be replaced with DateTimeImmutable. In $y, any place A was listed will be replaced with Product. No need to write multiple classes.

This is actually a huge feature for a great many use cases, which the RFC goes into so I won't repeat it.

Different solutions

The catch is that, in practice, there are three ways to do generics, all of which have drawbacks:

  • Reified: Every object carries around generic type information at runtime, which gives you the maximum amount of precision and safety but can have significant performance impact.
  • Monomorphized: Every time a generic class is used, the language engine does a copy-paste of the class and substitutes in the provided type. This also can have significant performance implications, and is really hard to do in an interpreted language like PHP.
  • Erased: Once the compiler has done its thing, the generic type information is thrown away and doesn't exist at runtime. This has the least performance impact (zero at runtime), but can leave "gaps" in what you're able to do with the generic information.

Different languages take different approaches, and even within the options above there is variation. In practice, I think Erased is the most common, especially in compiled languages. Compiled languages have the advantage of a mandatory build step in which the compiler can look at the entire program at once and statically ensure that all of the program's type information (generic and not) lines up. Once that's verified, the type information (or at least the generic portion of it) is no longer useful, so it is safe to discard. Mostly, at least: it means that at runtime, it's not possible to determine whether your function was passed a Test<DateTimeImmutable> or a Test<Product>. That may or may not be a problem.

But no good answer for scripting languages

But there's a catch for PHP: There is no whole-app compiler. PHP itself doesn't ever get a view of the "whole" application that it can use to determine if all the types line up. This is a problem any scripting language has, so it applies to PHP, Python, Ruby, Javascript, and all the rest.

The Javascript ecosystem solved this issue with TypeScript; TypeScript is a compiled language that happens to use Javascript as its compile target, rather than something more low-level. That means it does have the "whole app" view and can validate the types all line up, then throw that information away to produce Javascript.

Python solved it with fully-erased types. Python has a type syntax, but Python itself doesn't do anything with it. At all. Instead, 3rd party static analysis tools can examine an application and verify the types, if you remember to do so. But many (most? not sure) Python devs never do that, and never bother adding types, so they're little more than documentation with funny syntax.

PHP took the approach of runtime enforcement of types. It is the only major interpreted language where types are actually enforced, at runtime. If you type a parameter to take an int, and you pass a string to it, PHP will yell at you and throw a TypeError at runtime. Although this has a performance impact, it's generally been fairly small, and means that PHP can also have an incredibly robust Reflection system that knows all about the types of things.
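A minimal illustration of that runtime enforcement in current PHP (the function here is purely for demonstration):

```php
<?php
declare(strict_types=1);

// PHP checks declared parameter and return types at the moment of the call.
function half(int $n): int
{
    return intdiv($n, 2);
}

echo half(10), "\n"; // 5

try {
    half("ten"); // a string where an int is declared
} catch (TypeError $e) {
    echo "Caught: ", $e->getMessage(), "\n";
}
```

Because those types survive to runtime, Reflection can report them too (e.g. ReflectionParameter::getType()), which is what makes PHP's reflection system so robust.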

The catch has always been that runtime enforcement has an upper-bound of complexity it can realistically handle, and Generics have always fallen above that line. As a result, PHP's static analysis community has picked up the slack, offering, essentially, the same experience that Python has, but via custom docblock tags: You can run a checker script (PHPStan, Psalm, Mago, etc.) on your own to validate generic comments, but at runtime none of that matters. It's just comments with funny syntax.
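Concretely, today's docblock-style generics look something like this sketch, using the @template syntax that PHPStan and Psalm both understand (the class itself is hypothetical):

```php
<?php

/**
 * @template T
 */
final class TypedList
{
    /** @var list<T> */
    private array $items = [];

    /** @param T $item */
    public function add(mixed $item): void
    {
        $this->items[] = $item;
    }

    /** @return list<T> */
    public function all(): array
    {
        return $this->items;
    }
}

/** @var TypedList<DateTimeImmutable> $dates */
$dates = new TypedList();
$dates->add(new DateTimeImmutable('2026-05-13'));
$dates->add('not a date'); // A static analyzer flags this; PHP itself runs it happily.

echo count($dates->all()), "\n"; // 2
```

To PHP, those annotations really are just comments: the mismatched add() call executes without complaint.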

Erased...ish?

Seif's RFC is called "Bound Erased Generics," but IMO that's a bad name from a marketing perspective. It makes it sound like it's doing what Python does, which is what most people have meant by "erased generics" in the many previous discussions. In truth, it's much more clever than that: it enforces most generic rules, because most of them can be validated statically, either one file at a time or when "linking" the code. ("Linking," for PHP, mostly means verifying that a subclass obeys all the type rules it inherits from its parent, and it happens as a separate step after compilation.)
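To make the "linking" idea concrete, here's a sketch in the RFC's proposed syntax. The class names are mine, and none of this runs on any released PHP; it's purely illustrative:

```php
// Proposed syntax, per the RFC -- illustrative only.
interface Repository<T: object>
{
    public function find(int $id): T;
}

// Checked when the class is linked: User satisfies the "T: object" bound,
// and find() is required to return User.
class UserRepository implements Repository<User>
{
    public function find(int $id): User { /* ... */ }
}

// Rejected at link time: int does not satisfy the "T: object" bound.
class BrokenRepository implements Repository<int> { /* ... */ }
```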

Seif's argument is that this is good enough, and that the last 15% of generic validation can be handled by existing static analysis tools.

Seif maintains Mago, the youngest static analysis tool on the market. Daniil Gentili (maintainer of Psalm) and Ondřej Mirtes (maintainer of PHPStan) have publicly endorsed this approach. So have a number of other major developers. Several others, though, have expressed skepticism that the last 15% is safe to leave to 3rd party tools, and I count myself in that group.

"Compile" time

As I noted in a blog post for the PHP Foundation last year, there's a good chunk of generics functionality that could be done at compile time, even in PHP. Mainly, the "declaration" side of generics: Making sure a child class follows the generics rules of its parent class, etc. At the time, Gina Banyard was working on a subset of generics, "associated types," that would have no runtime impact. The feedback on that issue was decidedly mixed, and it hasn't had any public movement in a while.

This latest RFC takes that same idea and pushes it about as far as it can go. It's able to enforce:

  • Syntax (this is the easy one)
  • The rules of generics themselves
  • Default types
  • Variance (a generic type marked "input only" can't be used in a return type, and vice versa)
  • Classes that reference a generic interface, parent class, or trait must follow any rules that parent has for its types.
  • Generic calls and new object creation are partially enforced.

That's not a small chunk of generics. It also maintains the generic type information for reflection, so some runtime self-analysis is possible, and it can enforce generic rules that way if you want to get very obtuse in your code.

There are three main things it does not support:

class Group<T> {
    public function __construct(T $val) { ... }

    public function add(T $val) { ... }
}

$list = new Group::<User>(new Product());
$list->add(new Product());

This RFC would verify that there is only one type argument on Group, and that User is a legal type to use there. But it would not catch that the value being passed to the constructor is not a User, nor the one passed to the add() method.

function doStuff(Group<User> $group) { ... }

$group would be checked at runtime to be an instance of Group, but not that it is a Group of User objects.

if ($g instanceof Group<User>) { ... }

This wouldn't even be legal syntax, as the object doesn't "know" what type it is for.

Another very, very important feature is that generic type arguments are optional at call sites. If they are not provided, the engine simply defaults to whatever restriction was placed on the class itself (like T: Person, meaning any type that implements Person), or mixed if none was specified. So just calling

$g = new Group(new Product());

is legal. That makes it possible to add generics to a class without breaking existing calls to it; callers can then add type information gradually. Frankly, that isn't getting as much attention as it should, because it makes the adoption process for generics vastly easier.
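A sketch of that gradual-adoption path, again in the RFC's proposed syntax (class names are illustrative, and this does not run on any released PHP):

```php
// Proposed syntax -- illustrative only.
class Group<T: Person>
{
    public function __construct(T $val) { /* ... */ }
}

// Explicit: T is Employee (assuming Employee implements Person).
$a = new Group::<Employee>(new Employee());

// Implicit: T falls back to the declared bound, Person.
$b = new Group(new Employee());

// If Group declared no bound at all, T would default to mixed.
```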

Is it enough?

This is where the main controversy is. The RFC makes the argument that static analysis tools are already enforcing generics, so anyone who cares about generics is already using static analysis tools, so plugging the gaps above with static analysis tools is fine. The tool authors have already said they'd jump on this new syntax to offer their existing validation logic on it, which is great... for those developers already using static analysis tools.

Here's the problem: We have no idea how large the Venn diagram overlap between "wants generics" and "already uses a static analysis tool" is. The RFC suggests it's a circle, but its argument for that is also rather circular.

Moreover, whatever that overlap is now, it will not stay that way after generics are introduced. Within two years of generics being added to the language, the percentage of developers encountering and having to deal with them will be north of 90%. I fully expect further PHP features and standard library additions to leverage generics, so they will be unavoidable.

But static analysis tools? I have no idea offhand how widespread they are. I use them, and I know many developers do, but I also know many developers who don't. That means those developers will have a language feature that looks like it works, but actually does not. It will be the first time PHP has had "do nothing" syntax in the language itself. (Attributes are not do-nothing; they're there specifically as an extension point, not a feature unto themselves.)

The deciding factor

So that's the million dollar question, really: Is the percentage of developers that could get bitten by "do nothing" syntax because they don't use static analysis tools low enough to be acceptable?

If so, then this RFC is the best shot we've ever had at generics, and we should rally behind it. Aside from those explicitly stated gaps, it's an excellently written RFC, and aside from one syntax quirk (that is now a secondary vote) I like basically the whole thing. It is excellent work, and I have a whole host of places I would use it as soon as I'm able.

But if that percentage is not sufficiently low, we have a problem. It would mean some unknown number of users could easily write unsafe code while thinking it was safe. That code would then throw type errors mysteriously, far from the actual mistake, with no indication of "oh right, this is the do-nothing syntax."

One might argue that's acceptable; sloppy coders gonna sloppy code, we can't babysit everyone. That's valid, but the RFC also explicitly points out that it creates a baseline for further development. Could we add fully-enforced generics in the future, by adding reified generics, either fully or partially? Maybe. That's unclear, in part because of the performance question.

But suppose we were able to extend generics enforcement in the future. Any existing code that has a call-site generic type error (which wasn't caught because the developer wasn't using a static analysis tool) would become a runtime error.

Now, we could certainly state up front "your code was already bad, fix your broken code." PHP has done that many times in the past. The best example is PHP 8.0 promoting "undefined variable/key" notices to warnings, effectively removing the "missing value silently becomes null" feature from the language. Anyone who knew anything had already been fixing those for 15 years; developing with E_ALL error reporting enabled to catch exactly that has always been the recommendation. Nevertheless, I spent a month cleaning up thousands of lines in a major open source project by adding ?? checks to it, because that 20+ year old codebase had never bothered to clean up its sloppy code. And it was using both PHPStan and Psalm! I fully expect the same would happen with generics.
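That cleanup is mechanical but pervasive. In current PHP, it looks like this:

```php
<?php
$data = []; // 'name' was never set

// Before PHP 8.0, reading a missing key emitted an E_NOTICE and quietly
// yielded null; since 8.0 it emits an E_WARNING (and still yields null).
// $name = $data['name'];

// The fix: null coalescing makes the fallback explicit, with no diagnostic.
$name = $data['name'] ?? 'anonymous';
echo $name, "\n";
```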

I've written before that as a Free and free project, PHP doesn't actually "owe" anyone backward compatibility or even working code. But as I noted there, the project should still strive to not break people's code if it can be avoided, and by and large the PHP developers do put in that effort.

Conclusion

So how much of a landmine would "mostly enforced generics" be, in practice?

I really don't know. As of this writing, I am on the fence about this RFC for exactly that reason. I use static analysis tools, and would absolutely adopt generics the moment I get the opportunity, so I am arguably the exact target audience.

But as a project we still need to consider the impact of changes on non-target audiences. Dismissing bad code as bad code and therefore not worth supporting is fine... up to a point. PHP has spent 20 years working to remove gaps, footguns, and other places where sloppy code was previously permitted when it shouldn't be. I would hate to see us go backwards on that, especially at a time when PHP needs to work on improving its image.

I am tempted to issue a call for feedback from the community, but... that's never going to be a representative sample. It would probably only generate responses from people who use static analysis, and who therefore wouldn't have a problem.

I'll figure out how to vote on the RFC eventually, but for the moment, I'm still undecided. It's so close, and has such a strong upside. I want it. But it would be disingenuous to claim it doesn't also have non-trivial downsides we need to consider. And that's a balancing act that we're still working on.