How to install Pi-hole on your Synology NAS

Pi-hole is an open-source, network-wide ad-blocking software. It blocks advertisements, trackers, and unwanted content across your entire home network by intercepting and filtering DNS requests before they reach ad-serving domains.

Unlike browser extensions that work only on individual devices or browsers, Pi-hole operates at the network level, protecting all connected devices—without requiring any software installed on them.

How It Works

  • You set up Pi-hole on a device (typically a Raspberry Pi, but it can also run in Docker, a virtual machine, or other Linux systems).
  • Configure your router (or individual devices) to use Pi-hole as the DNS server.
  • When a device tries to load a webpage or app, it queries DNS for domain names (e.g., “ads.example.com”).
  • Pi-hole checks against blocklists (millions of known ad/tracker domains) and returns a “null” response for blocked ones, preventing ads from loading.
  • Legitimate requests pass through normally.
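The filtering decision described above can be sketched in a few lines of shell. This is purely illustrative—a real Pi-hole blocklist contains hundreds of thousands to millions of entries and the lookup is done by the FTL DNS engine, not by grep—but the logic is the same: if the queried domain is on the blocklist, answer with a null address; otherwise resolve normally.

```shell
# Stand-in blocklist with two entries (real lists are far larger)
printf 'ads.example.com\ntracker.example.net\n' > /tmp/blocklist.txt

# Sketch of the per-query decision Pi-hole makes
lookup() {
  if grep -qxF "$1" /tmp/blocklist.txt; then
    echo "0.0.0.0"           # blocked: null address, the ad never loads
  else
    echo "resolve upstream"  # legitimate: forward to the upstream DNS server
  fi
}

lookup ads.example.com   # → 0.0.0.0
lookup www.example.com   # → resolve upstream
```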

Key Benefits

  • Blocks ads in apps and places where traditional blockers can’t reach.
  • Improves privacy by stopping trackers.
  • Reduces bandwidth usage and speeds up browsing (fewer ads to load).
  • Provides a web dashboard for stats, query logs, and custom block/allow lists.
  • Can also serve as a DHCP server if needed.

Step-by-Step Instructions:

  1. Install Container Manager from Package Center on your Synology NAS. Container Manager is Synology's Docker-based container management package.
  2. Create a shared Docker folder for storing your Docker containers.
  3. Inside the Docker folder, create a new folder and name it pihole.
  4. Find the absolute path of the folder created in step 3 by viewing the properties of the folder.
  5. In the pihole folder created in step 3, create a new folder named etc-pihole (keep the folder name lowercase).
  6. In Container Manager, create a new project and name it pihole. Set the path to the pihole folder created in step 3, and select Create docker-compose.yaml as the source.
  7. Enter the following configuration information into the source box. Replace the volume paths with the path from step 4. The sample configuration shows /volume4/docker/pihole/ as an example; replace this with your path.
    # More info at https://github.com/pi-hole/docker-pi-hole/ and https://docs.pi-hole.net/
    services:
      pihole:
        container_name: pihole
        hostname: pihole
        image: pihole/pihole:latest
        ports:
          # DNS ports
          - "53:53/tcp"
          - "53:53/udp"
          # Default HTTP port
          - "8082:80/tcp"
          # Default HTTPS port. FTL will generate a self-signed certificate
          #- "443:443/tcp"
          # Uncomment the below if using Pi-hole as your DHCP server
          #- "67:67/udp"
          # Uncomment the line below if you are using Pi-hole as your NTP server
          #- "123:123/udp"
        environment:
          # Set the appropriate timezone for your location from
          # https://en.wikipedia.org/wiki/List_of_tz_database_time_zones, e.g.:
          TZ: 'America/New_York'
          # Set a password to access the web interface. Not setting one will
          # result in a random password being assigned
          FTLCONF_webserver_api_password: '<your_password>'
          # If using Docker's default `bridge` network, the DNS listening mode
          # should be set to 'ALL'
          FTLCONF_dns_listeningMode: 'ALL'
        # Volumes store your data between container upgrades
        volumes:
          # For persisting Pi-hole's databases and common configuration file
          - /volume4/docker/pihole/etc-pihole:/etc/pihole
          # Uncomment the below if you have custom dnsmasq config files that you
          # want to persist. Not needed for most starting fresh with Pi-hole v6.
          # If you're upgrading from v5 and have used this directory before, keep
          # it enabled for the first v6 container start to allow for a complete
          # migration; it can be removed afterwards. Needs environment variable
          # FTLCONF_misc_etc_dnsmasq_d: 'true'
          #- './etc-dnsmasq.d:/etc/dnsmasq.d'
        #cap_add:
          # See https://github.com/pi-hole/docker-pi-hole#note-on-capabilities
          # Required if you are using Pi-hole as your DHCP server, else not needed
          # - NET_ADMIN
          # Required if you are using Pi-hole as your NTP client to be able to
          # set the host's system time
          # - SYS_TIME
          # Optional, if Pi-hole should get some more processing time
          # - SYS_NICE
        restart: unless-stopped
  8. Click Next
  9. Click Next
  10. Click Done to start the installation.
  11. Once the installation is complete, access the Pi-hole web interface at http://<your-NAS-address>:8082/admin (port 8082 is the host port specified in the compose YAML).
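If you're comfortable with SSH, the folder work in steps 2–5 amounts to creating two nested folders. Here's a sketch; BASE defaults to a demo path so you can try it anywhere, but on the NAS you would set it to the absolute path of your shared Docker folder from step 4 (e.g. /volume4/docker):

```shell
# Create the pihole folder and its lowercase etc-pihole subfolder.
# BASE is a stand-in; on a Synology NAS it would be the absolute path of
# your shared Docker folder (e.g. /volume4/docker).
BASE="${BASE:-$HOME/docker-demo}"
mkdir -p "$BASE/pihole/etc-pihole"
# Print the absolute path to paste into the compose file's volumes: section
ls -ld "$BASE/pihole/etc-pihole"
```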

 

Note: There are many configuration options that can be specified in the compose.yaml file. Refer to https://docs.pi-hole.net/docker/ for more information.

Permanent link to this article: https://www.dvlprlife.com/2026/01/how-to-install-pi-hole-on-your-synology-nas/

Quick Tips: Pin Tabs in VS Code

Welcome to Quick Tips — a fast, focused series designed to help you work smarter.

Each post will give you one practical insight you can apply immediately, whether you’re coding, configuring your tools, or improving your workflow.

Here’s today’s Quick Tip:

Pin Tab in VS Code

When you’re bouncing between files in Visual Studio Code, it’s easy for tabs to get replaced or closed as you navigate.

If there’s a file you want to keep open (and easy to find), VS Code has a simple feature for that: pinned tabs.

Pinned tabs always stay visible and won’t get “swept away” by common tab-closing actions. They’re perfect for:

  • A main file you reference often
  • A configuration file you don’t want to lose track of
  • A long-running script you’re actively editing

How to Pin a Tab

  • Right‑click the tab you want to keep open
  • Select Pin
  • Or use the keyboard shortcut Ctrl+K Shift+Enter on Windows or Cmd+K Shift+Enter on Mac while the tab is active

Pinned tabs shift to the left and display a small pin icon so you can spot them instantly.
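If you pin tabs heavily, two related settings are worth knowing. This settings.json sketch shows how I'd tune them (pinnedTabSizing is long-standing; pinnedTabsOnSeparateRow requires a relatively recent VS Code release):

```jsonc
{
  // Show pinned tabs as compact, icon-only tabs ("normal", "compact", or "shrink")
  "workbench.editor.pinnedTabSizing": "compact",
  // Keep pinned tabs on their own row above the rest of the tabs
  "workbench.editor.pinnedTabsOnSeparateRow": true
}
```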

How to Unpin a Tab

  • Right‑click the pinned tab
  • Select Unpin
  • Or use the keyboard shortcut Ctrl+K Shift+Enter on Windows or Cmd+K Shift+Enter on Mac while the tab is active (the shortcut toggles the pinned state)

Why It Helps

This simple feature keeps your workspace organized and reduces the mental overhead of hunting for files you didn’t mean to close.

It’s one of those small productivity boosts that adds up over time.

Got a favorite shortcut or workflow tweak? Share it in the comments and subscribe to dvlprlife.com for more Quick Tips like this one!

Permanent link to this article: https://www.dvlprlife.com/2026/01/quick-tips-pin-tabs-in-vs-code/

Upgrading Extensions in Business Central: Version Checks and Upgrade Tags

If you’ve developed extensions for any length of time, you’ve probably learned the hard way that “upgrade code” is not where you want surprises.

You’re running in a system session, on customer data, in an environment update window, and whatever you do needs to be repeatable, safe, and fast.

Today, we’ll look at two common ways to control upgrade logic in Business Central:

  • Checking versions (What version am I upgrading from?)
  • Using upgrade tags (Has this upgrade step already run?)

Both approaches work, and you may often use a mix.

What Is an Upgrade Codeunit?

An upgrade codeunit is a codeunit with Subtype = Upgrade that Business Central executes as part of an extension upgrade.

There are two important scopes:

  • Per-company upgrade: Runs once per company.
  • Per-database upgrade: Runs once for the whole tenant/database.

A typical upgrade codeunit is structured around the upgrade triggers (preconditions → upgrade → validation). You’ll see these in Microsoft’s “Upgrading extensions” documentation.

Learn more about upgrading extensions and Upgrade codeunit here.
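As a minimal sketch, the per-company trigger structure looks like this (the object ID and name are placeholders):

```al
codeunit 50159 "DVLPR Upgrade Skeleton"
{
    Subtype = Upgrade;

    trigger OnCheckPreconditionsPerCompany()
    begin
        // Verify the company data is in an upgradeable state;
        // raising an error here aborts the upgrade before any changes are made.
    end;

    trigger OnUpgradePerCompany()
    begin
        // The actual per-company data transformation.
    end;

    trigger OnValidateUpgradePerCompany()
    begin
        // Verify the upgrade produced the expected result.
    end;
}
```

The per-database variants of these triggers (for example, OnUpgradePerDatabase) follow the same shape.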

Controlling Upgrade Logic with Version Checks

The “classic” approach is to gate each upgrade step by checking the version you’re upgrading from.

In upgrade code, you can read version information using NavApp.GetCurrentModuleInfo and then compare against ModuleInfo.DataVersion() (commonly interpreted as the data version you’re upgrading from).

Example (simplified):

codeunit 50160 "DVLPR Upgrade"
{
    SubType = Upgrade;

    trigger OnUpgradePerCompany()
    var
        CurrentModuleInfo: ModuleInfo;
    begin
        NavApp.GetCurrentModuleInfo(CurrentModuleInfo);

        // Only run this step when upgrading from versions older than 2.0.0.0
        if CurrentModuleInfo.DataVersion() < Version.Create(2, 0, 0, 0) then
            UpgradeStep_200();
    end;

    local procedure UpgradeStep_200()
    begin
        // data transformation, backfill, etc.
    end;
}

This pattern is straightforward and works well when:

  • You have a small number of versions to support.
  • Each upgrade step cleanly maps to a specific “from version” range.

Where it gets messy is when you’ve shipped many versions, had hotfixes, or need to make upgrade code resilient against partial runs.

The Problem with Pure Version Checks

Version checks alone don’t tell you whether the step has already executed successfully.

For example:

  • A tenant may have attempted an update, failed halfway through, and then retried.
  • You may ship a fix that needs to run even if the version comparison still matches.
  • You may be backporting an upgrade step into a servicing build.

In those cases, “Did we already run this exact step?” is a better question than, “What version are we upgrading from?”

Upgrade Tags: A Reliable Way to Make Upgrade Steps Idempotent

Upgrade tags are essentially a durable marker that says: “This specific upgrade step has been completed.”

Business Central provides the System.Upgrade codeunit “Upgrade Tag” (ID 9999) for this.

If you ever need to see what tags currently exist in an environment, you can open page 9985 Upgrade Tags to view the stored upgrade tags.

Microsoft Learn reference: Codeunit “Upgrade Tag”

The two methods you’ll use most often are:

  • HasUpgradeTag(Tag: Code[250]): Boolean
  • SetUpgradeTag(NewTag: Code[250])

(There are also database-scoped methods like HasDatabaseUpgradeTag and SetDatabaseUpgradeTag when you need a tag that applies at the database level.)
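For a step that must run only once for the whole database rather than once per company, the same pattern applies with the database-scoped methods. A sketch, with a placeholder object ID, name, and tag:

```al
codeunit 50162 "DVLPR Upgrade Per Database"
{
    Subtype = Upgrade;

    trigger OnUpgradePerDatabase()
    var
        UpgradeTag: Codeunit "Upgrade Tag";
        Tag: Code[250];
    begin
        Tag := 'DVLPR-50162-PerDatabaseStep-20260105';
        // Database-scoped check: the tag is shared across all companies
        if UpgradeTag.HasDatabaseUpgradeTag(Tag) then
            exit;

        PerformDatabaseWideStep();

        UpgradeTag.SetDatabaseUpgradeTag(Tag);
    end;

    local procedure PerformDatabaseWideStep()
    begin
        // tenant-wide data transformation
    end;
}
```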

Example: Using HasUpgradeTag / SetUpgradeTag in an Upgrade Codeunit

Here’s the basic pattern:

  1. Check if the tag exists.
  2. If it does, exit (step already completed).
  3. Run the upgrade logic.
  4. Set the tag.
codeunit 50161 "DVLPR Upgrade With Tags"
{
    SubType = Upgrade;

    trigger OnUpgradePerCompany()
    var
        UpgradeTag: Codeunit "Upgrade Tag";
        Tag: Code[250];
    begin
        Tag := 'DVLPR-50161-PerformUpgradeSomething-20260105';
        if UpgradeTag.HasUpgradeTag(Tag) then
            exit;

        PerformUpgradeSomething();

        UpgradeTag.SetUpgradeTag(Tag);
    end;

    local procedure PerformUpgradeSomething()
    begin
        // data transformation, backfill, etc.
    end;
}

A few notes:

  • The tag format is up to you, but make it unique and traceable. A common pattern is CompanyPrefix-WorkItem-Description-YYYYMMDD.
  • The critical part is when you set the tag—set it only after the step is complete.

Version Checks + Upgrade Tags Together

In many real upgrades, the best solution is a hybrid:

  • Use version checks to decide whether a step is relevant.
  • Use an upgrade tag to ensure the step runs at most once.

Example:

trigger OnUpgradePerCompany()
var
    CurrentModuleInfo: ModuleInfo;
    UpgradeTag: Codeunit "Upgrade Tag";
    Tag: Code[250];
begin
    NavApp.GetCurrentModuleInfo(CurrentModuleInfo);

    if CurrentModuleInfo.DataVersion() >= Version.Create(2, 0, 0, 0) then
        exit;

    Tag := 'DVLPR-200-UpgradeStep-20260105';
    if UpgradeTag.HasUpgradeTag(Tag) then
        exit;

    UpgradeStep_200();
    UpgradeTag.SetUpgradeTag(Tag);
end;

A Practical Reminder: Order Isn’t Guaranteed Across Upgrade Codeunits

One important design point from the platform: If you have multiple upgrade codeunits, the execution order between different upgrade codeunits is not something you should rely on.

Keep upgrade steps independent, or consolidate logically dependent work into a single upgrade codeunit.

Wrapping Up

You can control upgrade code in Business Central by checking versions, but upgrade tags add a second layer of safety: They let you make individual upgrade steps idempotent and resilient across retries and long-lived version histories.

If you’re building an extension that you expect to ship and maintain for years, upgrade tags are one of the best tools you can adopt early on.


Note: The code and information discussed in this article are for informational and demonstration purposes only. Always test upgrade code in a sandbox and make sure it behaves correctly on a copy of production data. This content was written referencing Microsoft Dynamics 365 Business Central 2025 Wave 2 online.

Permanent link to this article: https://www.dvlprlife.com/2026/01/upgrading-extensions-in-business-central-version-checks-and-upgrade-tags/

The Access Property in Business Central (AL)

If you build extensions long enough, you eventually get burned by a dependency you didn’t mean to create: another app starts calling into your “helper” codeunit, or relies on a table/field you never intended as public API. Later, you refactor—and everything breaks.


That’s the real value of the Access property: it lets you define (at compile time) what’s part of your supported surface area versus what’s strictly internal implementation.

What Is the Access Property?

The Access property sets the compile-time visibility of an AL object (and, for tables, of individual fields). In practice, it controls whether other AL code can take a direct reference to that symbol.

Think of it as the AL equivalent of “public vs internal” in other languages, but scoped to apps (modules) and extension objects.

What the Access Property Applies To

Access applies to:

  • Codeunit
  • Query
  • Table
  • Table field
  • Enum
  • Interface
  • PermissionSet

Object-Level Access: Public vs Internal

For most objects (tables, codeunits, queries, interfaces, enums), you’ll typically use:

  • Access = Public; (default): Other apps that reference your app can use the object.
  • Access = Internal;: Only code inside your app can reference the object.

Example: keep a table internal so nobody else compiles against it:

table 50111 "DVLPR Internal Staging"
{
    DataClassification = CustomerContent;
    Access = Internal;

    fields
    {
        field(1; "Entry No."; Integer) { }
        field(2; Payload; Blob) { }
    }
}

Table Field Access: Local, Protected, Internal, Public

Table fields add two extra options that are incredibly useful for designing clean extensibility:

  • Local: Only code in the same table or the same table extension object where the field is defined can reference the field.
  • Protected: Code in the base table and table extensions of that table can reference the field.
  • Internal: Anything inside the same app can reference the field.
  • Public (default): Any referencing app can reference the field.

Example: Table with different field access levels:

table 50140 "DVLPR Access Property"
{
    Access = Public;
    Caption = 'DVLPR';
    DataClassification = CustomerContent;

    fields
    {
        field(1; "Code"; Code[10])
        {
            Caption = 'Code';
            ToolTip = 'Specifies the value of the Code field.';
        }
        field(2; "Local Code"; Code[10])
        {
            Access = Local;
            Caption = 'Local Code';
            ToolTip = 'Specifies the value of the Local Code field.';
        }
        field(3; "Protected Code"; Code[10])
        {
            Access = Protected;
            Caption = 'Protected Code';
            ToolTip = 'Specifies the value of the Protected Code field.';
        }
        field(4; "Public Code"; Code[10])
        {
            Access = Public;
            Caption = 'Public Code';
            ToolTip = 'Specifies the value of the Public Code field.';
        }
        field(5; "Internal Code"; Code[10])
        {
            Access = Internal;
            Caption = 'Internal Code';
            ToolTip = 'Specifies the value of the Internal Code field.';
        }
    }
    keys
    {
        key(PK; "Code")
        {
            Clustered = true;
        }
    }
}

The Access levels for table fields are especially useful when you want to allow controlled extensibility without opening up everything.

Local and Protected access levels are not accessible outside the table or table extension.

If, for example, you have a field with Access = Local, you won’t be able to reference it by name from a page, report, or codeunit—even inside the same app.

One more practical detail from the platform: table and field accessibility affects the in-client Designer. Only Public table fields can be added to pages using Designer.

Sharing Internals Between Your Own Apps: internalsVisibleTo

Sometimes you do want internals shared—but only with your own “companion” apps. That’s where internalsVisibleTo in app.json comes in.

It allows specific friend modules to compile against your Access = Internal objects.

Example app.json snippet:

{
  "internalsVisibleTo": [
    {
      "id": "00000000-0000-0000-0000-000000000000",
      "name": "DVLPR Companion App",
      "publisher": "DVLPRLIFE"
    }
  ]
}

Important: Access Is Compile-Time Only (Not Security)

This is the part that’s easy to misunderstand.

Access is enforced at compile time. It is not a runtime security boundary.

One way to think about it: Access controls who can compile against your symbols, not who can ultimately interact with data at runtime. Business Central still has reflection-style mechanisms (such as RecordRef, FieldRef, and TransferFields) that can work with tables/fields without a direct symbol reference.

For example, even though the Local Code field is marked as Access = Local, you can still technically read and write it using RecordRef and FieldRef (because those APIs work by field number rather than a compile-time field reference):

    procedure GetLocalCode(): Code[10]
    var
        RecordRef: RecordRef;
        FieldRef: FieldRef;
        LocalCode: Code[10];
    begin
        RecordRef.GetTable(Rec);
        FieldRef := RecordRef.Field(2);
        LocalCode := FieldRef.Value;
        RecordRef.Close();

        exit(LocalCode);
    end;

    procedure SetLocalCode(NewLocalCode: Code[10])
    var
        RecordRef: RecordRef;
        FieldRef: FieldRef;
    begin
        RecordRef.GetTable(Rec);
        FieldRef := RecordRef.Field(2);
        FieldRef.Value := NewLocalCode;
        RecordRef.Modify();
        RecordRef.Close();
    end;

When I Reach for Each Level

My personal defaults:

  • Public: Objects/fields I’m willing to support as a stable contract.
  • Internal: Implementation objects I expect to refactor freely.
  • Protected (fields): When I want controlled extensibility through table extensions.
  • Local (fields): Fields that are strictly internal to the table logic.

Wrapping Up

The Access property is one of the most practical tools you have for keeping an extension maintainable over time. It helps you draw a clear line between API and implementation, reduces accidental coupling between apps, and makes your intent obvious to anyone reading your symbols.

Learn more about access modifiers here.

Learn more about the Access property here.

Learn more about internalsVisibleTo in the app.json schema here.

Note: The code and information discussed in this article are for informational and demonstration purposes only. The Access property is available from runtime version 4.0.

Permanent link to this article: https://www.dvlprlife.com/2025/12/the-access-property-in-business-central-al/

Feeling Behind, Looking Ahead: What the Future Holds for Developers in the Era of AI

A Moment Worth Pausing For

Let Karpathy’s words sink in for a moment.

Just 11 months ago, in February 2025, Andrej Karpathy—one of the most influential voices in modern AI—casually coined the term “vibe coding.” It was a playful phrase, yet it captured something profound: the exhilarating chaos of letting large language models interact with us, improvise with us, and sometimes surprise us more than we expected.

Less than a year ago, it felt like we were at the frontier. And yet here we are, just a few months later, and Karpathy himself is saying he’s never felt more behind.

If he feels that way, what does that mean for the rest of us?

The Ground Has Shifted—Again

As we approach 2026, the landscape has transformed at a pace that feels almost like sci-fi. The tools we once treated as clever assistants have evolved into something far more powerful—and far less predictable.
We’re no longer just prompting models. We’re orchestrating:

  • Agents that act on their own,
  • Workflows that chain intelligence together,
  • Systems that behave less like tools and more like collaborators with quirks, instincts, and emergent behaviors.

This isn’t just a new framework or a new library. It’s a new layer of abstraction—one that demands we rethink how software is conceived, built, and maintained.

It’s disorienting. It’s thrilling. And yes, it can make even the best of us feel like we’re scrambling to keep up.

The Beginning of a New Epoch

But here’s the beauty: we’re not witnessing the death of an era. We’re witnessing the birth of a new one.

This earthquake isn’t leveling the field—it’s clearing it. The old assumptions, the old constraints, the old rhythms of development are giving way, making room for something unprecedented to take root.

For the first time in decades, the craft of programming is being reinvented in real time. Every developer alive today has a front‑row seat to a transformation that future generations will study, much in the same way we reflect back on the dawn of the internet.

Why Feeling Behind Is a Good Sign

That feeling of being behind? While it can shake us at our foundations, it’s the unmistakable signal that we are alive in a crucial historical moment.

When the world accelerates, the sensation of lagging is evidence you’re still in the race. Still learning. Still adapting. Still alert.

And in a field defined by reinvention, that’s exactly where you want to be.

The New Developer Mindset

So yes—roll up your sleeves. Explore the tools. Break things. Build things. Let agents surprise you. Let workflows confuse you. Let the unpredictability both scare you and teach you.

Those who will thrive in this era won’t be the ones who memorize every new capability or master every new abstraction overnight. They’ll be the ones who approach this moment with curiosity, humility, and a willingness to play.

They’re the ones who embrace the beginner’s humility and the learner’s mindset.

We’re All Newbies Again

Relish this chance to be a newbie. For the first time in a long time, everyone is starting fresh.

The veterans. The newcomers. The researchers. The hobbyists. The people who’ve been coding for 30 years and the ones who just wrote their first prompt last week.

We’re all standing at the same threshold, staring into a future that’s bigger, stranger, and more full of possibility than anything we’ve built before.

And what an inspiring place and time to be in.

Permanent link to this article: https://www.dvlprlife.com/2025/12/feeling-behind-looking-ahead/

Protected Variables in Business Central (AL)

I’ve been writing Business Central extensions for many years, and one thing I’ve always wanted is a clean way to access the “working” variables inside the objects I’m extending. Too often, you end up re-implementing the same logic in a pageextension or reportextension just to get at a flag, buffer, or calculated value. Fortunately, there’s a solution for that: protected variables.

What Are Protected Variables?

Protected variables are global variables that an AL object intentionally exposes to extensions in a controlled way.

In AL, you declare them in a protected var section. That makes them accessible to extension objects that extend the source object, such as:

  • tables ↔ table extensions
  • pages ↔ page extensions
  • reports ↔ report extensions
  • dependent apps (extensions) when there is an explicit app dependency

This is especially useful when you have internal states (flags, buffers, counters, temporary records) that extensions legitimately need to read or toggle—without forcing everyone into copy/paste base logic.

Why Protected Variables Exist (And What They Replace)

Before protected variables, developers typically had to choose between:

  • Making a variable not accessible (so extensions can’t reuse it), or
  • Reworking the design into events/procedures, or
  • Duplicating logic in extensions (fragile and expensive)

Protected variables fill a pragmatic gap: they let the base object expose “just enough” internal states to extension objects.

Syntax: protected var vs var

The syntax is simple, but important:

protected var
    MyProtectedValue: Boolean;

var
    MyLocalOnlyValue: Integer;

  • Variables in protected var are accessible to extension objects that extend the source object.
  • Variables in var are local to the object and not accessible from extensions.

If you want to expose only some variables, you must split declarations into two sections (as shown above).

Example: Exposing a Page Flag to a Page Extension

This is a common real-world pattern: a base page maintains a flag that controls visibility or behavior and the extension needs to reuse that same flag.

Base page:

page 50100 "DVLPR My Page"
{
    SourceTable = Customer;
    PageType = Card;

    layout
    {
        area(Content)
        {
            group(Advanced)
            {
                Visible = ShowBalance;

                field(Balance; Balance)
                {
                    ApplicationArea = All;
                }
            }
        }
    }

    actions
    {
        area(Processing)
        {
            action(ToggleBalance)
            {
                ApplicationArea = All;
                trigger OnAction()
                begin
                    ShowBalance := not ShowBalance;
                end;
            }
        }
    }

    protected var
        ShowBalance: Boolean;
}

Page extension:

pageextension 50101 "DVLPR My Page Ext" extends "DVLPR My Page"
{
    layout
    {
        addlast(Content)
        {
            group(MoreBalance)
            {
                Visible = ShowBalance;

                field("Balance (LCY)"; "Balance (LCY)")
                {
                    ApplicationArea = All;
                }
            }
        }
    }
}

This is the difference between a clean extension and one that has to reimplement the base page’s behavior.

Benefits (What You Actually Gain)

  • Better extensibility contracts: You can intentionally expose state that is useful to extensions.
  • Less copy/paste logic: Extensions can build on the base behavior without recreating it.
  • Cleaner page/report extensions: You can reuse the base object’s “working variables” (visibility flags, buffer values, temporary records).
  • Cross-app collaboration: If App B depends on App A, App B can access App A’s protected variables when extending App A’s objects.

Trade-Offs to Keep in Mind

  • They are shared mutable state: an extension can change the value, potentially causing side effects.
  • They create coupling: if the base object later changes or removes the variable, dependent extensions may need to be updated.

If you need strict validation, invariants, or long-term stability, a protected/public procedure (or an event with parameters) is often a better design than exposing the variable directly.

Rules of Thumb and When to Use Them

I tend to reach for protected var when:

  • The variable is part of the object’s internal UI/state (visibility flags, cached totals, temporary buffers).
  • The extension genuinely needs to share the same state as the base object.
  • Exposing an entire public procedure would be overkill.

I avoid protected var when:

  • The extension should not mutate the value (prefer a procedure).
  • The value represents a business invariant (prefer validation + procedures/events).

Version Notes and Gotchas

Protected variables have been around for several Business Central versions, but you should still test your specific pattern on the oldest version you support.

A couple of practical lessons from the community:

  • If you’re binding values in a page extension and the source is “complex” (arrays, temporary buffers, etc.), it can be safer to stage the value into your own variable in a trigger (for example, OnAfterGetRecord) instead of binding directly.
  • Keep protected var focused on state you expect extensions to use. When the value needs validation or invariants, expose a procedure/event instead.
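The staging tip from the first bullet can look like this—a sketch building on the base page example earlier in this article, with a made-up object ID, name, and variable:

```al
pageextension 50102 "DVLPR Staged Ext" extends "DVLPR My Page"
{
    layout
    {
        addlast(Content)
        {
            // Bind the control to a simple page variable, not to the
            // complex source directly.
            field(StagedBalanceLCY; StagedBalanceLCY)
            {
                ApplicationArea = All;
                Caption = 'Balance (LCY), staged';
            }
        }
    }

    trigger OnAfterGetRecord()
    begin
        // Stage the value in a trigger so the control binds to a plain
        // Decimal instead of a FlowField on the source table.
        Rec.CalcFields("Balance (LCY)");
        StagedBalanceLCY := Rec."Balance (LCY)";
    end;

    var
        StagedBalanceLCY: Decimal;
}
```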

Further Reading

  • Microsoft Learn (protected variables): here

Wrapping Up

Protected variables are one of those small AL features that make extension design feel much more natural. When used intentionally, they allow page/report/table extensions to integrate tightly with the base object’s state.

Use them as a targeted extensibility tool: expose only what’s needed, keep the surface area small, and choose procedures/events when you need validation or a long-term stable API.

Note: The information in this article is for informational and demonstration purposes only. Protected variables apply to Business Central 2019 release wave 2 and later. Always test on the lowest supported Business Central version.

Permanent link to this article: https://www.dvlprlife.com/2025/12/protected-variables-in-business-central-al/

Delete Orphaned Extension Data in Business Central

What Is “Orphaned Extension Data”?

In Business Central, when an extension is uninstalled, the person performing the uninstall can choose to preserve its data. That’s intentional: it allows you to reinstall the extension later without losing data.

The downside is that you can end up with data for an extension that is no longer published. That leftover content is what people generally mean by orphaned extension data.

Over time—especially in environments that frequently cycle apps—this can add up and affect both storage and performance, which makes cleaning up orphaned data a worthwhile piece of proactive maintenance.

The Feature: Delete Orphaned Extension Data

In Dynamics 365 Business Central 2023 release wave 2, Microsoft introduced a built-in way to clean this up: the Delete Orphaned Extension Data page. This feature targets only extensions that are not currently installed but still have data.

It lets an admin:

  • See which uninstalled extensions still have data in the tenant.
  • Select one or more of those extensions.
  • Permanently delete the leftover data for those uninstalled extensions.

Read more about deleting orphaned extension data here.

Why You Should Care (Performance + Capacity)

Even if the extension is gone, the data it created can still have a cost:

  • Storage/capacity: orphaned data takes space until you remove it.
  • Table extension overhead: table-extension fields are stored by the platform in companion tables. Leaving old extension data behind can increase row size and overhead.
  • Upgrades and maintenance: less “dead” data usually means faster maintenance operations and fewer surprises.

Microsoft’s performance guidance is very consistent on this theme: keep the database lean.

How to Use It (In the Client)

You can run the cleanup directly from the Business Central client:

  1. Use Tell me and search for Delete Orphaned Extension Data.
  2. Open the page.
  3. Review the list of uninstalled extensions that still have data.
  4. Select the extension(s) you want to clean up.
  5. Choose Delete data (or the equivalent action).

If you’re looking for where extension install/uninstall is managed, that’s typically done from Extension Management.

Read more on Extension Management here.

What Actually Gets Deleted?

At a high level, this cleanup removes data that belongs to the selected uninstalled extension(s), including:

  • Data in extension-owned tables
  • Data stored for table extensions (platform companion table data)

Deleting orphaned data is a destructive operation. Once you delete extension data, there is no way to undo it from inside Business Central, so always test in a sandbox first.

If you’re unsure whether it’s safe, the best approach is:

  • Validate the extension is not needed
  • Confirm you have a backup/restore option
  • Test the cleanup in a Sandbox first

Permissions / Who Should Do This

Deleting orphaned extension data is an admin maintenance task and typically requires extension-management/admin permissions in the tenant.

If a user can’t see the page or actions, that’s usually the first thing to check.

Wrapping Up

The Delete Orphaned Extension Data feature is a small admin tool with a significant long-term payoff: it lets you reclaim capacity and improve performance by removing leftover extension data.

Note: The information in this article is for informational and demonstration purposes only. This content was written with reference to Microsoft Dynamics 365 Business Central 2023 release wave 2 Online and later. Always test cleanup in a sandbox first and ensure you have a recovery path before deleting data in production.

Permanent link to this article: https://www.dvlprlife.com/2025/12/delete-orphaned-extension-data-in-business-central/

Package Resources in Extensions and Access Them from AL

Many features need “starter” content: setup templates, default JSON config, RapidStart packages, demo data, or even HTML/email templates. Historically, AL developers ended up stuffing this kind of content into labels, giant Text constants, or helper Codeunits.

With Microsoft Dynamics 365 Business Central 2024 Wave 2, you can package resource files inside your extension and read them at runtime directly from AL. This is great for setup and initialization scenarios because it keeps content in real files (versionable, editable, diff-friendly) instead of in code.

Learn more about the feature here

You can find the full code for the example on GitHub.

How Resource Packaging Works

At a high level:

  • You add one or more folders to your extension that contain your resources.
  • You declare those folders in your manifest (app.json) using resourceFolders.
  • At runtime, AL reads the resource content using the NavApp data type (for example, NavApp.GetResource).

A key point from the release plan: an extension can access only its own resources.

Defining Resource Folders in app.json

To package resources, declare the folders that contain them by adding resourceFolders to app.json. The resourceFolders property lists the folders whose contents should be packaged into the app file; you can specify multiple folders, each of which can contain subfolders, and all paths are relative to the root of your project.

Example:

{
  "id": "00000000-0000-0000-0000-000000000000",
  "name": "getresource1",
  "publisher": "Default Publisher",
  "version": "1.0.0.0",
  "platform": "1.0.0.0",
  "application": "27.0.0.0",
  "runtime": "16.0",
  "resourceFolders": ["resources"]
}

Anything under those folders is packaged into the .app.
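
As an illustration (the file names are hypothetical, but match the examples later in this article), a project declaring "resourceFolders": ["resources"] might be laid out like this, with resource names resolved relative to the declared folder:

```text
getresource1/
├── app.json            // declares "resourceFolders": ["resources"]
├── resources/
│   ├── json/
│   │   └── items.json  // accessed as 'json/items.json'
│   └── text/
│       └── sample.txt  // accessed as 'text/sample.txt'
└── src/
```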

Resource Limits (Worth Knowing Up Front)

This feature has a few practical limits:

  • Any single resource file can be up to 16 MB.
  • All resource files together can be up to 256 MB.
  • An extension can have up to 256 resource files.

What resources do you have?

With so many resources, you might want to see what’s available. You can use NavApp.ListResources to get a list of all packaged resources, optionally filtered by a path prefix.

Learn more about NavApp.ListResources here.

NavApp.GetResource: The Core Building Block

NavApp.GetResource retrieves a resource that was packaged with the current app and loads it into an InStream.

Syntax (from docs):

NavApp.GetResource(ResourceName: Text, var ResourceStream: InStream [, Encoding: TextEncoding])
  • ResourceName: the name/path of the resource you want to retrieve.
  • ResourceStream: an InStream variable that receives the resource content.
  • Encoding (optional): stream encoding (default is MSDos). In practice, you’ll usually want TextEncoding::UTF8 for JSON and text templates.

Learn more about NavApp.GetResource here.

One common pitfall: the resource name must match the packaged path exactly, so a wrong name or path will fail at runtime. Keep your folder structure simple and consistent.
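
If you do want the stream form—for example, to control how the content is read—a minimal sketch looks like this (the procedure name is my own; InStream.ReadText reads one line at a time, so the loop concatenates until the end of the stream):

```al
local procedure LoadResourceText(resourceName: Text) Result: Text
var
    ResourceInStream: InStream;
    Line: Text;
begin
    // Load the packaged resource into an InStream, read as UTF-8 text.
    NavApp.GetResource(resourceName, ResourceInStream, TextEncoding::UTF8);

    // ReadText reads up to the next line break; loop until end of stream.
    while not ResourceInStream.EOS() do begin
        ResourceInStream.ReadText(Line);
        Result += Line;
    end;
end;
```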

The Other Methods

If your resource is plain text or JSON, you can often skip the InStream plumbing (though that approach still works) and use one of the convenience methods instead:

  • NavApp.GetResourceAsText(Text [, TextEncoding]) returns the resource directly as Text.
  • NavApp.GetResourceAsJson(Text [, TextEncoding]) returns the resource directly as a JsonObject.

If you’re only dealing with text or JSON, those convenience methods can make your code shorter. If you need streaming semantics (or want to control how text is read), NavApp.GetResource is the most general option.

Learn more about NavApp.GetResourceAsText(Text [, TextEncoding]) here.

Learn more about NavApp.GetResourceAsJson(Text [, TextEncoding]) here.

Example 1: Load a JSON Resource with NavApp.GetResourceAsJson

A very common real-world scenario is packaging different data configurations as JSON resources, then reading them at runtime.

var
    JSONText: Text;
    ResourceJSONFileLbl: Label 'json/items.json';

trigger OnOpenPage()
begin
    this.JSONText := this.LoadJSON(this.ResourceJSONFileLbl);
end;

local procedure LoadJSON(resource: text): Text
var
    ResourceJson: JsonObject;
    JSON: Text;
begin
    ResourceJson := NavApp.GetResourceAsJson(resource, TextEncoding::UTF8);
    ResourceJson.WriteTo(JSON);
    exit(JSON);
end;

This keeps your JSON out of AL, while still letting AL load and use it at runtime.

Example 2: Load a Text Resource with NavApp.GetResourceAsText

Another real-world scenario is packaging an email template as a resource, then reading it at runtime. This allows you to provide a default email template that can be easily updated by changing the resource file, without modifying the AL code.

var
    SampleText: Text;
    ResourceTextFileLbl: Label 'text/sample.txt';

trigger OnOpenPage()
begin
    this.SampleText := this.LoadText(this.ResourceTextFileLbl);
end;

local procedure LoadText(resource: text): Text
begin
    exit(NavApp.GetResourceAsText(resource, TextEncoding::UTF8));
end;

Example 3: Listing All Packaged Resources

With so many resources, you might want to list the available resources and present them as selection options.

trigger OnOpenPage()
begin
    this.AllResourceNames := this.ResourceTextList('');
    this.ImageResourceNames := this.ResourceTextList('images');
end;

local procedure ResourceTextList(filter: Text): Text
var
    ResourceList: List of [Text];
    ResourceNames: Text;
    resourceIndex: Integer;
begin
    ResourceList := NavApp.ListResources(filter);
    // AL has no ternary operator, so use an explicit if to append the
    // '\' separator to every entry except the last one.
    for resourceIndex := 1 to ResourceList.Count() do begin
        ResourceNames := ResourceNames + ' ' + ResourceList.Get(resourceIndex);
        if resourceIndex < ResourceList.Count() then
            ResourceNames := ResourceNames + '\';
    end;

    exit(ResourceNames);
end;

Wrapping Up

Packaging resources in extensions (and reading them from AL) is one of those quality-of-life features that quickly becomes a standard pattern. It makes setup and initialization cleaner, keeps content in real files, and reduces the temptation to hardcode large templates or JSON blobs in AL. You can also let users select which resources to load at runtime, enabling more dynamic behavior, such as choosing different email templates based on user preferences or loading demonstration data.

You can find the full code for the example on GitHub.

Note: The code and information discussed in this article are for informational and demonstration purposes only. This content was written referencing Microsoft Dynamics 365 Business Central 2025 Wave 2 online.

Permanent link to this article: https://www.dvlprlife.com/2025/12/package-resources-in-extensions-and-access-them-from-al/

Control Add-in Object in Business Central

What Is a Control Add-in in Business Central?

A control add-in is an AL object you use to embed a custom web-based UI component inside the Business Central client. Think of it as a bridge between the Business Central page framework and HTML/JavaScript/CSS running in the browser client. The client hosts the add-in on a page (typically rendered in an iframe) and loads the JavaScript and CSS packaged with your extension.

The key concept is that Business Central renders the add-in in the client, and you communicate between AL and JavaScript using events (JS → AL) and procedures (AL → JS).

I’ve been having a lot of fun building control add-ins and vibe-coded something fun for the holiday.

You can find the full code for the example on GitHub.

How Control Add-ins Work

When a page that contains a usercontrol is opened, the Business Central web client loads the add-in resources packaged in your extension (JavaScript, CSS, images). The add-in renders into a host container in the page.

From there, the integration is two-way:

  • JavaScript raises events back to AL using Microsoft.Dynamics.NAV.InvokeExtensibilityMethod('EventName', [args]).
  • AL calls JavaScript functions by invoking procedures declared on the controladdin object (which must exist in your JS runtime).

You can think of it as:

  • Events: “JavaScript is telling AL something happened.”
  • Procedures: “AL is telling JavaScript to update the UI.”

Creating a Control Add-in Object

To get started, you’ll typically:

  1. Create a controladdin object in AL.
  2. Add your JS/CSS files to the extension (often under an addin/ or controladdin/ folder).
  3. Reference those files from Scripts, StartupScript, and StyleSheets.
  4. Define events (JS → AL) and procedures (AL → JS).
  5. Place it on a page using a usercontrol.

Here’s the control add-in definition for my holiday example:

controladdin DVLPRControlAddIn
{
    HorizontalShrink = true;
    HorizontalStretch = true;
    MaximumHeight = 300;
    MaximumWidth = 700;
    MinimumHeight = 300;
    MinimumWidth = 700;
    RequestedHeight = 300;
    RequestedWidth = 700;
    Scripts = 'controladdin/scripts.js';
    StartupScript = 'controladdin/start.js';
    StyleSheets = 'controladdin/style.css';
    VerticalShrink = true;
    VerticalStretch = true;

    procedure Animate()
    procedure Render(html: Text);
    event OnControlAddInReady();
    event ShowError(ErrorTxt: Text);
}

A few notes on those properties:

  • StartupScript runs when the client loads the add-in; it's typically used to bootstrap the control and raise the initial event back to the page that contains the add-in.
  • Scripts is where you put the bulk of your implementation (functions that AL procedures call, helpers, etc.).
  • StyleSheets is optional, but recommended for maintainability.
  • Sizing properties (RequestedHeight, MinimumHeight, VerticalStretch, etc.) help your add-in behave predictably in pages.

Using the Control Add-in on a Page

Once the controladdin exists, you host it on a page via usercontrol. Below is a simple Card page example that:

  • Receives a JavaScript event when the control is loaded and ready (JS → AL event).
  • Calls JavaScript procedures to render HTML and start an animation (AL → JS procedures).

page 50100 "DVLPR Christmas Tree Page"
{
    ApplicationArea = All;
    Caption = 'Christmas Tree';
    UsageCategory = Lists;

    layout
    {
        area(Content)
        {
            group(controls)
            {
                Caption = 'Merry Christmas!';
                usercontrol(PageControlAddIn; DVLPRControlAddIn)
                {
                    trigger OnControlAddInReady()
                    begin
                        CurrPage.PageControlAddIn.Render(@'
                        <div id="scrolltext">Merry Christmas!</div>
                        <div class="tree">
                            <div class="lights">
                                <div class="light"></div>
                                <div class="light"></div>
                                <div class="light"></div>
                                <div class="light"></div>
                                <div class="light"></div>
                                <div class="stump"></div>
                            </div>
                        </div>');
                        CurrPage.PageControlAddIn.Animate();
                    end;

                    trigger ShowError(ErrorTxt: Text)
                    begin
                        Error(ErrorTxt);
                    end;
                }
            }
        }
    }
}

OnControlAddInReady() is your “safe moment” to start calling procedures into JavaScript, because the client has loaded the resources and the JavaScript runtime is initialized.

Note: In Business Central 2025 Wave 1 and later, you can also use the new UserControlHost page type to host control add-ins in a full-page experience.

Learn more about that here.
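
As a rough sketch of that option (the page ID and name are hypothetical, and I'm assuming the same usercontrol layout applies), a UserControlHost page hosting the same add-in might look like this:

```al
page 50101 "DVLPR Tree Host"
{
    // UserControlHost gives the add-in the full page surface
    // (available in Business Central 2025 Wave 1 and later).
    PageType = UserControlHost;
    ApplicationArea = All;
    UsageCategory = Tasks;
    Caption = 'Christmas Tree (Full Page)';

    layout
    {
        area(Content)
        {
            usercontrol(HostedAddIn; DVLPRControlAddIn)
            {
                trigger OnControlAddInReady()
                begin
                    // Same initialization pattern as on a regular page.
                end;
            }
        }
    }
}
```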

JavaScript: Rendering UI and Calling Back into AL

Now for the JavaScript side. The easiest pattern is:

  • In start.js: signal that the add-in is ready.
  • When something happens: call Microsoft.Dynamics.NAV.InvokeExtensibilityMethod(...) with the event name defined in AL. The Business Central client will route that to your AL event handler.

controladdin/start.js

Microsoft.Dynamics.NAV.InvokeExtensibilityMethod('OnControlAddInReady', []);

That InvokeExtensibilityMethod call maps directly to the AL event:

trigger OnControlAddInReady()
begin
end;

So the Business Central client will invoke the trigger OnControlAddInReady() block inside your usercontrol.

JavaScript: Implementing AL Procedures (AL → JS)

If you declare a procedure in the controladdin object, you must implement a matching function in JavaScript so AL can call it.

From the AL object:

    procedure Render(html: Text);

Implement it in a JS file you included under Scripts (for example scripts.js):

controladdin/scripts.js

function Render(html) {
    try {
        document.getElementById('controlAddIn').innerHTML = html;
    }
    catch (e) {
        Microsoft.Dynamics.NAV.InvokeExtensibilityMethod('ShowError', [e.toString()]);
    }
}

Now this AL call will work (it passes HTML to render):

CurrPage.PageControlAddIn.Render(@'
                        <div id="scrolltext">Merry Christmas!</div>
                        <div class="tree">
                            <div class="lights">
                                <div class="light"></div>
                                <div class="light"></div>
                                <div class="light"></div>
                                <div class="light"></div>
                                <div class="light"></div>
                                <div class="stump"></div>
                            </div>
                        </div>');

CSS: Keep It Simple and Contained

A small stylesheet helps keep the markup readable:

controladdin/style.css

body {
    display: flex;
    justify-content: center;
    align-items: center;
    height: 100vh;
    background-color: #000;
    color: #fff;
    font-family: Arial, sans-serif;
}

#scrolltext {
    position: absolute;
    top: 50px;
    left: 50px;
    font-size: 24px;
    overflow-x: hidden;
    white-space: nowrap;
}

.tree {
    position: absolute;
    top: 180px;
    left: 320px;
    width: 0;
    height: 0;
    border-left: 50px solid transparent;
    border-right: 50px solid transparent;
    border-bottom: 100px solid green;
    margin-bottom: -30px;
}

.tree:before {
    content: '';
    position: absolute;
    top: -50px;
    left: -25px;
    width: 0;
    height: 0;
    border-left: 25px solid transparent;
    border-right: 25px solid transparent;
    border-bottom: 50px solid green;
}

.tree:after {
    content: '';
    position: absolute;
    top: -80px;
    left: -15px;
    width: 0;
    height: 0;
    border-left: 15px solid transparent;
    border-right: 15px solid transparent;
    border-bottom: 30px solid green;
}

.stump {
    position: absolute;
    top: 190px;
    left: 5px;
    width: 20px;
    height: 20px;
    background-color: brown;
}

.lights {
    position: absolute;
    top: -90px;
    left: -15px;
    width: 30px;
    height: 170px;
    display: flex;
    flex-direction: column;
    justify-content: space-between;
    align-items: center;
}

.light {
    width: 10px;
    height: 10px;
    border-radius: 50%;
    background-color: red;
    animation: blink 1s infinite;
}

.light:nth-child(2) {
    background-color: yellow;
    animation-delay: 0.2s;
}

.light:nth-child(3) {
    background-color: blue;
    animation-delay: 0.4s;
}

.light:nth-child(4) {
    background-color: white;
    animation-delay: 0.6s;
}

.light:nth-child(5) {
    background-color: orange;
    animation-delay: 0.8s;
}

@keyframes blink {

    0%,
    100% {
        opacity: 1;
    }

    50% {
        opacity: 0.5;
    }
}

(Keep your CSS scoped to your own classes so you don’t accidentally affect the surrounding Business Central page.)

Common Pitfalls (That Everyone Hits Once)

  • Calling procedures before the control is ready: use OnControlAddInReady() for initialization calls.
  • Event name mismatches: the string you pass to InvokeExtensibilityMethod('OnControlAddInReady', ...) (or ShowError) must match the AL event name exactly.
  • Trying to do server work in the add-in: treat it as UI; keep business logic in AL/codeunits.

Wrapping Up

Control add-ins are helpful when you need a richer client experience than standard AL page controls can provide. Once you learn the basic rhythm—declare events/procedures in AL, implement the UI in JavaScript, and connect them with InvokeExtensibilityMethod—you can build surprisingly powerful UI integrations (I’ve even created a few games within Business Central—more on that later) while keeping business logic in AL.

Learn more about the control add-in object here.

You can find the full code for the example on GitHub.

Note: The code and information discussed in this article are for informational and demonstration purposes only. This content was written referencing Microsoft Dynamics 365 Business Central 2025 Wave 2 online.

Permanent link to this article: https://www.dvlprlife.com/2025/12/control-add-in-object-in-business-central/

December 2025 Cumulative Updates for Dynamics 365 Business Central

The December updates for Microsoft Dynamics 365 Business Central are now available.

Before applying the updates, you should confirm that your implementation is ready for the upgrade and ensure compatibility with your modifications. Work with a Microsoft Partner to determine if you are ready and what is needed for you to apply the update.

Please note that Online customers will automatically be upgraded to version 27.2 over the coming days/weeks and should receive an email notification when upgraded.

Direct links to the cumulative updates are listed here:

Dynamics 365 Business Central On-Premises 2025 Release Wave 2 – 27.2 (December 2025)

Dynamics 365 Business Central On-Premises 2025 Release Wave 1 – 26.8 (December 2025)

Dynamics 365 Business Central On-Premises 2024 Release Wave 2 – 25.14 (December 2025)

Dynamics 365 Business Central On-Premises 2024 Release Wave 1 – 24.18 (October 2025)

Dynamics 365 Business Central On-Premises 2023 Release Wave 2 – 23.18 (April 2025)

Dynamics 365 Business Central On-Premises 2023 Release Wave 1 Updates – 22.18 (October 2024)

Permanent link to this article: https://www.dvlprlife.com/2025/12/december-2025-cumulative-updates-for-dynamics-365-business-central/