Testing When Using Laravel's SoftDeletes

Chris Di Carlo • November 16, 2020

The other night I was trying to debug a rather irritating issue in an app where some tests would inexplicably pass or fail. There didn't seem to be any pattern: I could run the whole suite 10 times and have a single test method fail each time, or I could run a single method and have it fail 2 out of 4 times. The only consistent thread was that the failures involved my tests around editing a model. Needless to say, it drove me crazy!

Well, after about 3 hours of debugging and troubleshooting, I stumbled upon the reason and honestly, it was a facepalm moment. The issue: I was using the SoftDeletes trait on this particular model, and in its factory I had, in my infinite wisdom, decided to be "proactive" and written the following code to seed my database:

'deleted_at' => $this->faker->optional(0.5, null)->dateTimeBetween('-1 year'),
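For context, that line lived in the model's factory definition, something along these lines (the factory class and its other attributes here are illustrative, not my actual code):

```php
<?php

namespace Database\Factories;

use App\Models\Client;
use Illuminate\Database\Eloquent\Factories\Factory;

class ClientFactory extends Factory
{
    protected $model = Client::class;

    public function definition()
    {
        return [
            'name' => $this->faker->company(),
            // The problem line: roughly half of all factory-built
            // records come out of the factory already soft deleted.
            'deleted_at' => $this->faker->optional(0.5, null)->dateTimeBetween('-1 year'),
        ];
    }
}
```

With `optional(0.5, null)`, Faker returns the `dateTimeBetween()` value about half the time and `null` the other half, which is exactly why the failures looked random.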

At first glance, nothing untoward there - just want to simulate having a random set of soft deleted records, right? And truthfully, there's nothing really wrong with the code as written per se - the issue stems from not fully comprehending the impact it has when testing and using the factory to generate the records.

Here's an example of one of the methods that was randomly failing. Pretty simple route existence test:

/** @test */
public function users_can_view_the_edit_form()
{
    $this->signIn();

    $client = Client::factory()->create();

    $response = $this->get('clients/'.$client->id.'/edit');
    $response->assertOk();
}

I've written similar tests hundreds of times without any issues. So what's so special now? Is it because I'm now on Laravel 8? Using Jetstream as my boilerplate? Using Livewire and/or Alpine.js? Is the moon positioned just so in the night sky, with a crow sitting on the fence over yonder?

Um, no. In fact, it was me forgetting a most basic principle when dealing with soft deletes in Laravel.

It all boils down to one tiny bit of information I'd forgotten: soft deleted records are automatically excluded from query results. It's right in the docs, too; I just missed it in my straw-grasping antics while trying to find a fix for the failing tests. It turns out the tests failed only when the model was created with a populated deleted_at column; when it was null, everything worked fine, hence the seemingly random failures. I removed that one line of code from the factory and BAM! - smooth-as-silk testing is back, baby!
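Concretely, the SoftDeletes trait registers a global scope that tacks a `whereNull('deleted_at')` onto every Eloquent query for that model. So a controller lookup like the sketch below (simplified, not my actual controller) comes up empty whenever the factory happened to populate deleted_at, and the route 404s instead of returning 200:

```php
// Eloquent silently adds "where deleted_at is null" to this query,
// so a soft deleted client is never found; findOrFail() throws a
// ModelNotFoundException, which Laravel converts to a 404 response.
$client = Client::findOrFail($id);

// Including soft deleted records requires an explicit opt-in:
$client = Client::withTrashed()->findOrFail($id);
```

That 404 is what made `$response->assertOk()` fail, but only on the runs where Faker rolled a non-null deleted_at.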

The moral of the story: that line really should have been delegated to a factory state, so I could apply it when needed without mucking with the base factory definition.
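Here's a sketch of what that could look like with a Laravel 8 class-based factory. The `trashed` state name is my own choice here (newer Laravel releases ship a built-in `trashed()` factory helper, but in Laravel 8 you define it yourself), and the `name` attribute is just a placeholder:

```php
class ClientFactory extends Factory
{
    protected $model = Client::class;

    public function definition()
    {
        return [
            'name' => $this->faker->company(),
            // deleted_at is no longer set here; factory-built
            // records are live by default.
        ];
    }

    // Opt-in state that produces a soft deleted record.
    public function trashed()
    {
        return $this->state(fn (array $attributes) => [
            'deleted_at' => $this->faker->dateTimeBetween('-1 year'),
        ]);
    }
}
```

Then a test that actually needs a soft deleted record asks for one explicitly with `Client::factory()->trashed()->create()`, while every other test gets a live record and a passing assertion.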

Hopefully you haven't stumbled on this issue, but if you have, I hope this post helps!

Happy coding! Cheers!