We are a Chartered Accountant firm. This initiative shares updates on laws, regulations, accounting and finance, with the aim of creating awareness among the public at large and contributing to nation building, since we Chartered Accountants are Partners in Nation Building.
The DPDP Act establishes a clear and significant penalty framework. Penalties are imposed by the Data Protection Board of India after due inquiry, giving the accused an opportunity to be heard. The penalties are civil in nature — monetary fines, not criminal prosecution.
The Penalty Schedule
Failure to implement security safeguards: up to ₹250 crore
Failure to notify a breach to the Board or affected individuals: up to ₹200 crore
Breach of children’s data obligations: up to ₹200 crore
Breach of Significant Data Fiduciary obligations: up to ₹150 crore
Breach of any other provision of the Act or Rules: up to ₹50 crore
Breach of duties by a Data Principal: up to ₹10,000
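The schedule above can be expressed as a simple lookup, of the kind a compliance dashboard might use. This is an illustrative sketch only; the category keys are our own informal labels, not statutory wording.

```python
# Illustrative sketch: the DPDP penalty ceilings above as a lookup table.
# Amounts are in ₹ crore, except the Data Principal duty breach, which is
# a flat ₹10,000. Keys are informal labels, not statutory terms.
PENALTY_SCHEDULE_CRORE = {
    "failure_to_implement_security_safeguards": 250,
    "failure_to_notify_breach": 200,
    "childrens_data_obligations": 200,
    "significant_data_fiduciary_obligations": 150,
    "any_other_provision": 50,
}

DATA_PRINCIPAL_DUTY_BREACH_INR = 10_000  # duty breach by an individual

def max_penalty_crore(category: str) -> int:
    """Return the statutory ceiling (in ₹ crore) for a breach category."""
    return PENALTY_SCHEDULE_CRORE[category]
```

These figures are ceilings, not fixed amounts; as the next section explains, the Board weighs several factors before deciding the actual penalty.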
What Factors Determine the Penalty Amount?
The Board does not automatically impose the maximum. It considers:
Nature, gravity, and duration of the breach
Sensitivity of the personal data involved
Whether the breach was repetitive
Whether any gain was made or loss avoided
Whether timely steps were taken to mitigate harm
The likely impact of the penalty on the organisation
Examples
Example 1 — Failure to Secure Data (₹250 crore) A large e-commerce platform stores millions of customer records — names, addresses, and payment details — without encryption or access controls. Hackers exploit this and steal the data. The platform had no reasonable safeguards in place. The Board finds them liable for up to ₹250 crore.
Example 2 — Failure to Report a Breach (₹200 crore) A telecom company discovers that its customer database has been compromised. Instead of notifying the Board and affected customers promptly, it delays disclosure for weeks hoping to manage the situation internally. This failure to notify attracts a penalty of up to ₹200 crore.
Example 3 — Children’s Data Violation (₹200 crore) An ed-tech platform collects data of students under 18 without obtaining verifiable parental consent. It also runs targeted advertisements directed at children on its platform. Both violations together attract a penalty of up to ₹200 crore.
Example 4 — Significant Data Fiduciary Default (₹150 crore) A major social media platform notified as a Significant Data Fiduciary fails to appoint a Data Protection Officer based in India and does not conduct its mandatory annual Data Protection Impact Assessment. The Board imposes a penalty of up to ₹150 crore.
Example 5 — Data Principal Misuse (₹10,000) An individual files repeated false complaints against a company with the Data Protection Board, with no genuine grievance. The Board finds the complaints frivolous and imposes a penalty of up to ₹10,000 on the individual.
Can the Government Go Further?
Yes. If the Board reports that penalties have been imposed on a Data Fiduciary on two or more occasions, the Central Government may direct platforms and intermediaries to block public access to that organisation’s services in India — making repeat non-compliance an existential risk for businesses.
The Key Takeaway
Penalties under the DPDP Act are not symbolic. They are substantial, scalable, and designed to deter. Compliance is not a one-time exercise — it is an ongoing obligation, and the cost of ignoring it far exceeds the cost of getting it right.
Disclaimer
The contents of this post are intended for general awareness and informational purposes only. They do not constitute legal opinion, professional advice, consultancy, statutory interpretation, or a recommendation to act in any particular manner.
The Digital Personal Data Protection Act, 2023, related rules, notifications, regulatory guidance and judicial interpretations may evolve from time to time. The applicability of the law may also vary depending on the facts, sector, nature of data processing, organisational role, contractual terms and compliance framework.
Readers should not rely solely on this post for making legal, business, HR, technology, data-processing or compliance decisions. Specific advice from a qualified legal, privacy, cybersecurity, governance or compliance professional should be obtained before acting on any matter discussed.
The author / publisher shall not be responsible for any loss, liability, claim, penalty or consequence arising from reliance on the contents of this post without independent professional advice.
Consent is the foundation of the DPDP Act. Before collecting or processing any personal data, a Data Fiduciary must obtain consent that meets every one of the following conditions. If even one condition is missing — the consent is invalid.
The Five Pillars of Valid Consent
Free — Consent must not be forced, pressured, or made a condition for a service where the data is not genuinely necessary. The individual must have a real choice.
Specific — Consent must be tied to a clearly defined purpose. A blanket “I agree to everything” is not valid. Each purpose requires its own consent.
Informed — The individual must know exactly what data is being collected, why it is being collected, and what their rights are — before they consent.
Unconditional — Consent cannot be bundled with unrelated terms or conditions. It must stand on its own.
Unambiguous with a Clear Affirmative Action — Silence, pre-ticked boxes, or inaction do not count as consent. The individual must actively and clearly say yes.
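The five pillars read naturally as an all-or-nothing checklist. A minimal sketch, assuming a consent record captured as boolean flags (field names are ours, not the Act's):

```python
# Sketch: the five conditions of valid consent as a checklist.
# If even one condition is missing, the consent is invalid.
from dataclasses import dataclass

@dataclass
class Consent:
    free: bool                # a real choice, not forced or bundled with access
    specific: bool            # tied to one clearly defined purpose
    informed: bool            # what data, why, and the individual's rights
    unconditional: bool       # not bundled with unrelated terms
    affirmative_action: bool  # an active "yes", not silence or a pre-ticked box

def is_valid_consent(c: Consent) -> bool:
    """Valid only when every one of the five conditions holds."""
    return all([c.free, c.specific, c.informed,
                c.unconditional, c.affirmative_action])
```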
What is the Notice Requirement?
Before seeking consent, every Data Fiduciary must serve a Notice to the individual. This notice must be in clear, plain language — not buried in legal jargon. It must be available in English or any language listed in the Eighth Schedule of the Indian Constitution.
The Notice must contain:
What data is being collected — A clear description of the personal data proposed to be processed.
Why it is being collected — The specific purpose for which the data will be used.
How to exercise rights — A clear explanation of how the individual can access, correct, erase their data, or withdraw consent.
How to withdraw consent — The notice must explicitly state the manner in which consent can be withdrawn; this is a distinct, mandatory element, separate from the general rights section. It must also make clear that consent extends only to the data necessary for the specified purpose, so the individual understands they are not consenting to unlimited data collection, and that withdrawal does not affect the legality of processing already carried out before it. For data collected before the Act came into force, this notice obligation is triggered as soon as reasonably practicable.
How to complain — Details of how the individual can raise a complaint with the Data Protection Board of India.
Who to contact — Business contact information of the Data Protection Officer or a designated person who can answer questions about data processing.
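The six required elements above lend themselves to a completeness check before a notice goes out. A sketch under our own assumptions (the keys are informal labels, not statutory language):

```python
# Sketch: the mandatory contents of a DPDP consent notice as a checklist.
# Keys are our own informal labels, not the wording of the Act or Rules.
REQUIRED_NOTICE_ELEMENTS = {
    "data_collected",       # what personal data is proposed to be processed
    "purpose",              # the specific purpose of processing
    "rights_exercise",      # how to access, correct, erase data
    "withdrawal_manner",    # how to withdraw consent
    "complaint_procedure",  # how to complain to the Data Protection Board
    "contact",              # DPO or designated contact person
}

def missing_notice_elements(notice: dict) -> set:
    """Return the required elements that are absent or empty in a draft notice."""
    return {k for k in REQUIRED_NOTICE_ELEMENTS if not notice.get(k)}
```

A draft notice covering only the data and purpose would fail this check, flagging the withdrawal, complaint, rights and contact elements as missing.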
What About Data Already Collected Before the Act?
If consent was obtained before the Act came into force, the Data Fiduciary must still issue a notice — as soon as reasonably practicable — informing the individual of the data held, its purpose, and how to exercise their rights going forward.
What Happens to Invalid Consent?
Any portion of consent that violates the Act is invalid to that extent. The rest of the consent may still hold — but the Data Fiduciary cannot rely on the invalid portion to justify processing.
A Practical Example
A food delivery app asks you to sign up. Before you proceed, it shows a notice stating: your name, phone number, and address will be used to deliver your orders. It tells you how to delete your account and who to contact for queries. You then tap “I Agree” — actively, not by default. That is valid consent.
If instead the app pre-ticks a box agreeing to share your data with advertising partners — that portion of consent is invalid.
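The severability rule in this example can be sketched as a per-purpose filter, under the assumption (ours) that each purpose carries its own affirmative-action flag:

```python
# Sketch: consent held purpose by purpose. Only purposes given through an
# active, affirmative action survive; a pre-ticked box is recorded as False.
def valid_purposes(consents: dict) -> set:
    """Return the purposes for which consent was actively given."""
    return {purpose for purpose, affirmative in consents.items() if affirmative}

# The food-delivery scenario: delivery was actively agreed to;
# ad-partner sharing came only from a pre-ticked box.
signup = {"order_delivery": True, "ad_partner_sharing": False}
```

Here `valid_purposes(signup)` yields only the delivery purpose; the app cannot rely on the pre-ticked advertising portion to justify processing.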
I’m a US Citizen Providing Services to Indians. Am I Covered Under the DPDP Act?
The short answer is — Yes, very likely.
The DPDP Act, 2023 is not limited to organisations or individuals based in India. Its reach is intentionally extraterritorial, designed to protect Indian individuals regardless of where the entity collecting their data is located.
What Does the Act Say?
The Act applies to the processing of digital personal data in two scenarios:
Within India — Any personal data collected in digital form (or digitised from non-digital form) within the territory of India.
Outside India — Any processing of digital personal data outside India, if such processing is in connection with offering goods or services to individuals in India.
This second provision is what covers you directly as a US-based service provider.
Does This Apply to Me?
Ask yourself these questions:
Do you collect personal data of individuals located in India? If yes — names, email addresses, phone numbers, payment details, usage behaviour — you are processing personal data of Indian Data Principals.
Do you offer goods or services to individuals in India? If your platform, app, or service is accessible to and targeted at Indian users — even if your servers are in the US — you fall within the scope of the Act.
Do you receive payment or registrations from Indian users? If Indian individuals are signing up, subscribing, or transacting with you, you are offering services to Data Principals within India.
If your answer to any of the above is yes, the DPDP Act applies to you.
What Are Your Obligations?
As a Data Fiduciary operating from outside India, you must:
Obtain free, informed, and unambiguous consent from Indian users before collecting their data
Provide a clear notice describing what data is collected and why
Use the data only for the stated purpose
Implement reasonable security safeguards to prevent breaches
Delete the data once the purpose is served or consent is withdrawn
Report breaches to the Data Protection Board of India and affected users promptly
Are There Any Restrictions on Sending Data Back to the US?
Yes, potentially. The Central Government has the power to restrict transfer of personal data to specific countries. If India notifies the US as a restricted destination, additional compliance steps may apply before you can transfer or store Indian users’ data on US servers.
What Happens If You Don’t Comply?
Non-compliance exposes you to penalties imposed by the Data Protection Board of India — up to ₹250 crore for security failures and up to ₹200 crore for failure to report a breach. The Board has jurisdiction over processing that affects Indian Data Principals, regardless of where you are based.
A personal data breach under the DPDP Act, 2023 is any unauthorised processing, accidental disclosure, acquisition, sharing, use, alteration, destruction, or loss of access to personal data — that compromises its confidentiality, integrity, or availability.
In plain terms: if personal data ends up where it shouldn’t, gets changed without authorisation, or becomes inaccessible when it should be available — it is a breach.
How Does a Breach Happen?
Breaches can occur in many ways — through external attacks, internal negligence, or simple system failures.
Example 1 — Cyberattack A hospital’s patient database is hacked. Names, phone numbers, diagnoses, and medical histories of thousands of patients are stolen and published online. This is a breach of confidentiality.
Example 2 — Accidental Disclosure An HR executive accidentally emails salary slips of 500 employees to the wrong mailing list. The data was not stolen — but it was disclosed without authorisation. Still a breach.
Example 3 — Insider Threat A bank employee downloads and sells customer account details to a third party for personal gain. This is unauthorised processing — a serious breach.
Example 4 — Ransomware Attack A company’s servers are encrypted by ransomware. All customer data becomes inaccessible. Even though data was not stolen, loss of availability is a breach under the Act.
Example 5 — Third-Party Vendor Failure A Data Fiduciary shares customer data with a cloud service provider (Data Processor). The vendor suffers a security failure and the data is exposed. The Data Fiduciary remains accountable.
What Must a Data Fiduciary Do After a Breach?
The Act imposes a strict response obligation:
Notify immediately — Inform every affected Data Principal about the nature of the breach, its likely consequences, and the steps being taken to contain it.
Report to the Board — Intimate the Data Protection Board of India with full details: the root cause, timeline of events, persons responsible, mitigation measures taken, and steps to prevent recurrence.
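The two-step response above can be modelled as records an incident team fills in before notifying anyone. The field names are our own assumptions, not a prescribed format under the Act or Rules.

```python
# Sketch: the mandated contents of the individual notification and the
# Board intimation after a breach. Field names are illustrative only.
from dataclasses import dataclass

@dataclass
class DataPrincipalNotice:
    nature_of_breach: str
    likely_consequences: str
    containment_steps: list

@dataclass
class BoardReport:
    root_cause: str
    timeline_of_events: list
    persons_responsible: str
    mitigation_measures: list
    recurrence_prevention: list

def report_is_complete(report) -> bool:
    """Every mandated detail must be filled in before filing."""
    return all(bool(getattr(report, f)) for f in report.__dataclass_fields__)
```

A report with any empty field fails the check, which mirrors the Act's expectation of full details rather than a bare acknowledgement of the incident.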
What Are the Consequences?
Failure to implement safeguards that could have prevented the breach attracts a penalty of up to ₹250 crore. Failure to notify the Board or affected individuals attracts up to ₹200 crore — both imposed by the Data Protection Board.
The Key Takeaway
A breach is not just a hacking incident. Sending data to the wrong person, losing a device with unencrypted data, or a vendor’s server going down — all can qualify. The obligation to protect data, and to respond swiftly when things go wrong, rests squarely on the Data Fiduciary.
If your organisation decides why and how personal data is collected and processed — you are a Data Fiduciary. This includes businesses, hospitals, schools, employers, apps, government bodies, and NGOs. Size does not matter; if you collect personal data of individuals in India, you qualify.
Q: Do I need consent before collecting data?
Yes. Before collecting any personal data, you must give the individual a clear notice describing what data is being collected and why. Consent must be free, specific, informed, and unambiguous.
Q: Can I collect more data than I need?
No. You may only collect data that is necessary for the stated purpose. Collecting excess data is a violation of the Act.
Q: How long can I keep the data?
Only as long as the purpose requires. Once the purpose is served or the individual withdraws consent, you must delete the data — unless a law requires you to retain it for a specific period.
Q: What security measures must I put in place?
You must implement reasonable technical and organisational safeguards — including encryption, access controls, monitoring logs, and data backups — to prevent unauthorised access or breaches.
Q: What must I do if there is a data breach?
You must immediately notify the Data Protection Board and every affected individual, describing the nature of the breach, its likely impact, and the steps being taken to contain it.
Q: Must I have a grievance mechanism?
Yes. Every Data Fiduciary must establish an effective grievance redressal system and publish contact details of a person who can respond to Data Principal queries.
Q: What are the penalties for non-compliance?
Penalties can reach up to ₹250 crore for failure to implement security safeguards, and up to ₹200 crore for failure to report a breach — imposed by the Data Protection Board.
You are a Data Principal if you are the individual whose personal data is being collected. Every time you fill a form, sign up for an app, or share your details with a business or government portal — you are the Data Principal. You have the right to access, correct, erase your data, and withdraw consent at any time.
Q: What is a Data Fiduciary?
You are a Data Fiduciary if your organisation decides why and how personal data is collected and used. Banks, hospitals, e-commerce platforms, employers, and government bodies are all Data Fiduciaries. You must obtain consent, give prior notice, secure the data, report breaches, and delete data once the purpose is served.
Q: What is a Data Processor?
You are a Data Processor if you handle personal data on behalf of a Data Fiduciary — under a contract, not on your own initiative. Cloud providers, payroll vendors, and IT service companies typically fall here. You follow instructions; the Fiduciary remains legally accountable.
Q: What is a Significant Data Fiduciary?
You are a Significant Data Fiduciary (SDF) if the Central Government notifies you as one — based on the volume or sensitivity of data you process, risk to individuals’ rights, or implications for national security. As an SDF, you must additionally appoint a Data Protection Officer (based in India), conduct annual Data Protection Impact Assessments, and undergo independent data audits.
Q: Can I be more than one?
Yes. A company can be both a Data Fiduciary (for its customers’ data) and a Data Processor (for data it handles on behalf of another business). Roles depend on context, not just who you are.
Q: What if I’m just an individual using data for personal purposes?
The Act does not apply to data processed purely for personal or domestic use. If you’re not collecting data as part of a business or service, you fall outside the Act’s scope.
The Digital Personal Data Protection Act, 2023 (No. 22 of 2023) is India’s landmark law governing how digital personal data is collected, stored, and used. Enacted on 11 August 2023, it strikes a balance between two equally important goals: protecting every individual’s right to privacy and enabling organisations to process data for legitimate purposes.
At its core, the Act sets clear rules — if you collect someone’s data, you must have their consent, use it only for the stated purpose, keep it secure, and delete it when you no longer need it. Any breach of these rules can attract penalties of up to ₹250 crore.
The Act is India’s answer to a digital economy where personal data — names, phone numbers, health records, financial information — flows constantly between individuals, businesses, and government systems.
Who’s Involved?
The Act defines five key players in every data transaction:
Data Principal — The individual whose personal data is being collected. They are the owner of their data, with full rights to access it, correct it, and demand its deletion. For minors (under 18), their parents or guardians act on their behalf.
Data Fiduciary — Any organisation or person that decides why and how personal data is processed. Think banks, hospitals, e-commerce platforms, HR departments, or any app that collects your information. They carry the heaviest compliance responsibilities.
Data Processor — An entity that processes data on behalf of a Data Fiduciary under a contract. For example, a cloud service provider or a payroll processing company. They act on instructions — the Fiduciary remains accountable.
Consent Manager — A registered intermediary that lets individuals manage all their consents in one place — giving, reviewing, and withdrawing consent across multiple platforms through a single interoperable interface.
Significant Data Fiduciary (SDF) — A Data Fiduciary flagged by the Central Government as high-risk due to the volume or sensitivity of data they handle. They face additional obligations: appointing a Data Protection Officer (DPO) based in India, conducting annual Data Protection Impact Assessments (DPIAs), and undergoing independent audits.
The Data Protection Board of India — The regulatory authority that investigates complaints, adjudicates breaches, and imposes penalties. Appeals against its decisions go to the Telecom Disputes Settlement and Appellate Tribunal (TDSAT).
Is it Applicable to Me?
If you are an individual — Yes, as a Data Principal. Every time you share your name, phone number, email, or any other personal information with an app, a website, a hospital, or a government portal, the DPDP Act protects you. You have the right to know what data is collected, ask for corrections, demand erasure, and withdraw your consent. You also have duties — you must not submit false information or file frivolous complaints.
If you are a business or organisation — Yes, as a Data Fiduciary, if you collect or process digital personal data of individuals in India. This applies whether you are a startup, an enterprise, an NGO, or a government body. The Act applies to you even if you are based outside India, as long as you offer goods or services to people in India.
If you are a vendor or service provider — Yes, as a Data Processor, if you handle personal data on behalf of a client organisation. You must operate under a valid contract and implement appropriate security safeguards.
The Act does not apply to data processed purely for personal or domestic use, or to data that has already been made publicly available by the individual themselves.
In short — if your work or daily life involves digital personal data in any way, the DPDP Act is relevant to you.
I’m Sharing My Data — What Are My Rights & Duties?
Q: Do I have the right to know what data is collected about me?
Yes. You can request a summary of your personal data being processed, the purpose for which it is used, and the names of all organisations it has been shared with.
Q: Can I correct my data if it’s wrong?
Yes. You have the right to correct inaccurate or incomplete data and to erase data that is no longer needed for the purpose it was collected.
Q: Can I take back my consent?
Yes — at any time. Withdrawing consent must be as easy as giving it. Once withdrawn, the organisation must stop processing your data within a reasonable time.
Q: What if I have a complaint?
Every Data Fiduciary must provide a readily available grievance redressal mechanism. If unresolved, you can escalate your complaint to the Data Protection Board of India.
Q: What happens to my data if I die or become incapacitated?
You can nominate another individual in advance to exercise your data rights on your behalf in the event of your death or incapacity.
Q: Do I have any duties too?
Yes. Rights come with responsibilities. As a Data Principal you must not:
Provide false or impersonated information
Suppress material information when applying for government documents or benefits
File false or frivolous complaints with a Data Fiduciary or the Board
Submit unverifiable information when requesting correction or erasure
Breach of these duties can attract a penalty of up to ₹10,000.
Q: Can a child exercise these rights?
A minor’s rights are exercised by their parent or lawful guardian. Organisations must obtain verifiable parental consent before collecting a child’s data, and cannot track, monitor, or target advertising at children.
The Prime Minister’s appeal for enabling Work from Home, hybrid working arrangements, online meetings and reduced travel wherever operationally feasible deserves a positive response from employers. WFH is not merely a welfare measure — it reduces urban congestion, carbon footprint, and commuting costs, and can demonstrably improve productivity when implemented thoughtfully.
Employers who embrace WFH should do so, however, through a written WFH policy that is legally reviewed and addresses the full spectrum of applicable Indian law — including Indian labour laws (Factories Act, Industrial Disputes Act, Shops and Establishments Acts as applicable to the state of operation), employment contracts, standing orders where applicable, POSH obligations (Prevention of Sexual Harassment at the Workplace Act, 2013), applicable social security obligations (EPF, ESI, gratuity), and the Digital Personal Data Protection Act, 2023.
The DPDP dimension of Work from Home, particularly in relation to what employee monitoring an employer may or may not lawfully conduct, forms the subject matter of this advisory. Although the DPDP framework is being implemented in phases and is expected to become fully effective from May 13, 2027, organisations should use the transition period to align their WFH monitoring practices with the principles of notice, purpose limitation, data minimisation, proportionality, security and employee privacy.
Setting the Legal Framework
In the DPDP context, the employer is the Data Fiduciary — the entity that determines the purpose and means of processing personal data (Section 2(i), DPDP Act). The employee is the Data Principal — the individual to whom the personal data relates (Section 2(j)). Every monitoring mechanism deployed in a WFH environment involves the collection and processing of the employee’s personal data, and in several cases, personal data of other household members who are entirely outside the employment relationship.
The key lawful basis available to employers under the DPDP Act for processing employee data without requiring consent is Section 7(i) — which permits processing for purposes of employment or those related to safeguarding the employer from loss or liability, such as prevention of corporate espionage, maintenance of confidentiality of trade secrets, intellectual property, classified information, or provision of any service or benefit sought by a Data Principal who is an employee.
This is a legitimate use — but it is not a blanket surveillance licence. It is bounded by two fundamental constraints that apply throughout the DPDP Act.
The first is data minimisation and necessity: under Section 6(1), processing must be limited to personal data that is necessary for the specified purpose. The same principle is expressed in the Second Schedule to the DPDP Rules, which requires processing to be limited to such personal data as is necessary for such uses or achieving such purposes.
The second is proportionality: the Puttaswamy judgment (Justice K.S. Puttaswamy (Retd.) vs Union of India, 2018) lays down the constitutional standard. A limitation of a fundamental right is permissible only if it is designated for a proper purpose, the measures adopted are rationally connected to that purpose, no alternative measures could achieve the same purpose with a lesser degree of limitation, and there is a proper balance between the importance of the purpose and the importance of the right being limited. This proportionality doctrine, not merely statutory compliance, governs what an employer may permissibly do in a WFH monitoring context.
Issue 1 — Screen Activity Monitoring and Recording
What is collected: Application usage, websites visited, documents accessed, communications composed, work patterns, keystroke sequences, and browsing behaviour.
DPDP Analysis:
Screen monitoring limited to work-device activity during designated working hours has a defensible basis under Section 7(i) for the purpose of preventing corporate data leakage, protecting trade secrets, and ensuring compliance with confidentiality obligations. However, the following distinctions are critical.
Monitoring which applications are open, whether corporate systems are being accessed, and whether DLP alerts are triggered — these are proportionate to the stated purpose. They constitute access logs and activity metadata, not a reproduction of content.
Full-day screen recording — capturing every pixel of everything displayed on an employee’s screen continuously — is disproportionate. It records personal communications, personal browser activity, medical or financial information briefly displayed, and the content of confidential client communications. The Puttaswamy proportionality test asks whether there is a less intrusive measure that achieves the same purpose. The answer here is yes — DLP alerts, access logs, and application monitoring achieve the security objective without wholesale content surveillance.
Keystroke logging is particularly invasive. It captures passwords, personal messages, draft communications subsequently deleted, and information entirely unrelated to work. No legitimate employment purpose requires the capture of every keystroke. It fails the proportionality test.
Notice obligation: Even where Section 7(i) applies and consent is not required, Section 5 of the DPDP Act requires the employer to give the employee a notice describing the personal data being collected and the purpose. A monitoring policy must be explicitly communicated in clear and plain language — not merely buried as a clause in an employment agreement signed on joining.
Assessment: Limited, purpose-specific screen activity monitoring is defensible. Full-day screen recording and keystroke logging are disproportionate and non-compliant.
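To make the distinction concrete, a scoped monitoring agent of the kind found defensible above would record only activity metadata — which application, when, and whether a DLP rule fired — and never screen content or keystrokes. The sketch below is purely illustrative (not any real product's API); the field names and employee ID format are assumptions.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ActivityEvent:
    employee_id: str        # pseudonymous identifier, not the employee's name
    timestamp: str          # UTC time of the event
    application: str        # application identifier only, e.g. "corporate-crm"
    corporate_system: bool  # whether a corporate system was accessed
    dlp_alert: bool         # True only if a data-leakage rule was triggered

def log_event(employee_id: str, application: str,
              corporate_system: bool, dlp_alert: bool = False) -> dict:
    """Return an access-log record containing no screen or keystroke content."""
    event = ActivityEvent(
        employee_id=employee_id,
        timestamp=datetime.now(timezone.utc).isoformat(),
        application=application,
        corporate_system=corporate_system,
        dlp_alert=dlp_alert,
    )
    return asdict(event)

record = log_event("EMP-1042", "corporate-crm", corporate_system=True)
# The record holds only access metadata — no reproduction of displayed content.
assert "content" not in record
```

The design point is that the schema itself enforces minimisation: there is simply no field in which screen content or keystrokes could be stored.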
Issue 2 — Webcam Open During Working Hours (Continuous Video Feed)
What is collected: Real-time visual data of the employee, their home environment, family members, and potentially other individuals who happen to be present.
DPDP Analysis:
A continuously open webcam during working hours collects several distinct categories of personal data simultaneously, each with a distinct legal problem.
The employee’s own image is personal data under Section 2(t). Continuous live video of a person in their home environment — revealing health indicators, emotional state, domestic circumstances, and living conditions — goes well beyond what is necessary to verify that work is being performed. Login activity, task completion, and access logs establish work activity without visual surveillance of the employee’s home.
Far more seriously, the webcam captures the personal data of household members — family, children, domestic workers — who are entirely outside the employment relationship and have given no consent whatsoever. They are Data Principals under the Act with full rights. The employer has no lawful basis — neither consent under Section 6 nor legitimate use under Section 7(i) — to collect their personal data. Section 7(i) is expressly limited to employment purposes; it does not extend to surveillance of third parties who happen to share the employee’s home.
The presence of children in the webcam feed creates an additional dimension. Section 9 of the DPDP Act requires verifiable parental consent before processing the personal data of a child under eighteen years of age. A household webcam that captures a child in the background is processing a child’s personal data without any consent mechanism at all.
Assessment: Continuous webcam monitoring during working hours is legally indefensible under the DPDP Act. It collects the personal data of third parties who have given no consent, captures children’s data in violation of Section 9, and fails the proportionality test even for the employee’s own data. This practice must be discontinued.
Issue 3 — Periodic Photographs Captured via Webcam
What is collected: Timestamped still facial images of the employee at regular automated intervals.
DPDP Analysis:
A facial photograph is unambiguously personal data. Periodic automated facial photographs, particularly when used for identity verification or attendance confirmation, constitute biometric data processing. The Puttaswamy judgment extensively addresses biometric data protection internationally, noting that facial scans require explicit consent and robust safeguards — principles consistent with the DPDP Act’s consent framework.
If the employer relies on Section 7(i), it must demonstrate that periodic facial photograph capture is necessary to prevent corporate espionage or safeguard trade secrets. This argument is very difficult to sustain. Login event logs, VPN connection records, and access logs establish that an authorised employee is operating the device — without requiring the capture and storage of biometric-level facial images at regular intervals.
If the employer instead relies on employee consent under Section 6, that consent must be free, specific, informed, unconditional, and unambiguous. The Puttaswamy judgment and DPDP consent jurisprudence recognise that consent given in an employment context — where an employee may fear job loss for withholding consent — is of questionable freedom. Consent embedded in a joining formality cannot satisfy the “free” requirement of Section 6(1).
The retention of these timestamped photographs over months or years represents a significant and growing database of biometric personal data, subject to Rule 8’s erasure obligations and Rule 6’s security requirements.
Assessment: The highest-risk element of WFH monitoring. Periodic facial photograph capture via webcam is biometric data processing that fails both the proportionality test under Section 7(i) and the free consent standard under Section 6(1). This practice carries acute DPDP compliance risk and must be discontinued.
Issue 4 — IP Address Capture and Tracking
What is collected: The employee’s home IP address, from which approximate residential location and internet service provider can be derived.
DPDP Analysis:
An IP address is personal data under Section 2(t). A home IP address additionally reveals information about the employee’s private residence. Two distinct use cases must be separated.
IP address for access authentication — verifying that a VPN or corporate system connection originates from an authorised device — has a clear and defensible basis under Section 7(i). This is consistent with the CERT-In Elemental Cyber Defense Controls (ACIM.1, ACIM.2), which require unique user IDs and role-based access controls. Using IP as an authentication signal in this context is proportionate.
Continuous tracking and storage of home IP addresses over time — correlating them to build location patterns, monitoring for address changes, or retaining them as a surveillance dataset — has no proportionate justification under the legitimate use basis. It constitutes location surveillance of an employee’s private residence, with no rational connection to the purpose of protecting trade secrets or corporate data.
Additionally, an employer who stores a database of all employees’ home IP addresses creates a significant breach risk. If that database is compromised, the residential network details of every WFH employee are exposed — creating a real-world security risk for the employees themselves that the employer, as Data Fiduciary, is responsible for preventing under Rule 6(1).
Assessment: IP capture for authentication is permissible. Continuous tracking, profiling, or retention of home IP addresses as a surveillance instrument is disproportionate and non-compliant.
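The permissible pattern described above, a point-in-time authentication check with no long-term IP retention, can be sketched as follows. This is an illustrative assumption of how such a check might work; the network ranges shown are reserved documentation addresses, not real corporate values.

```python
import ipaddress

# Hypothetical set of networks registered for VPN access.
AUTHORISED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def ip_is_authorised(client_ip: str) -> bool:
    """Point-in-time check: does this connection originate from an
    authorised network? The raw IP is used for the decision and is not
    written to any long-term store, so no location-pattern database
    accumulates over time."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in AUTHORISED_NETWORKS)

assert ip_is_authorised("203.0.113.45") is True   # authorised range
assert ip_is_authorised("192.0.2.10") is False    # unknown network
```

The proportionality boundary sits in what happens after the check: authenticate, then discard, rather than authenticate, then archive.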
Issue 5 — Aggregation and Profiling
What is collected: The combination of screen data + webcam feed + facial photographs + keystroke logs + IP address + application usage = a comprehensive behavioural profile of the employee in their home environment.
DPDP Analysis:
Each element of the monitoring regime creates a data exposure. But the aggregation of these elements into a combined profile is qualitatively more invasive than any single element. The Puttaswamy judgment specifically addresses informational privacy as a distinct constitutional right — the right to control what information about oneself is collected, aggregated, and used by others.
The aggregated profile reveals not just work activity but health patterns (movement in webcam), domestic relationships (household members visible), financial circumstances (home environment), and psychological state (work patterns and response times). None of this is necessary for employment management. It is behavioural surveillance that goes far beyond the employer’s legitimate interests under Section 7(i).
Assessment: Behavioural profiling through data aggregation in a WFH context has no valid lawful basis under the DPDP Act. It is disproportionate, exceeds any employment purpose, and constitutes a systematic interference with the employee’s informational privacy.
What the Employer Should Do — The Permissible Framework
Employers have a genuine and legitimate interest in protecting confidential information, trade secrets, intellectual property, customer data, and business systems. The DPDP Act fully recognises this under Section 7(i). The question is not whether to protect these interests but how — through measures that are proportionate, transparent, and respectful of the employee’s privacy in their home.
Recommended measures that are defensible under Section 7(i):
Secure VPN access — all WFH connections should be routed through a corporate VPN, ensuring encrypted transmission and access authentication without exposing the employee’s home network.
MFA for all corporate systems, particularly those accessing customer data, financial records, or intellectual property.
Role-based access control (RBAC) ensuring employees can access only the data their role requires.
Access logs and audit trails showing which systems were accessed, when, and by whom — without recording the content of what was viewed.
DLP (Data Leakage Prevention) alerts triggered when confidential data is transmitted outside authorised channels — flagging the event, not recording all activity.
Device management policies governing corporate devices used for WFH, including remote wipe capability in the event of loss or separation.
Task-based performance review and deliverable tracking — the most privacy-preserving and effective form of WFH productivity management, requiring no personal data beyond work outputs.
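As an illustration of how role-based access control confines each employee to the data their role requires, here is a minimal sketch. The role names, resource names, and permission mapping are assumptions for illustration only.

```python
# Hypothetical mapping of roles to the data classes each may access.
ROLE_PERMISSIONS = {
    "sales": {"customer_contacts"},
    "finance": {"customer_contacts", "financial_records"},
    "engineering": {"source_code"},
}

def can_access(role: str, resource: str) -> bool:
    """Allow access only to data classes the employee's role requires
    (least privilege); unknown roles get no access at all."""
    return resource in ROLE_PERMISSIONS.get(role, set())

assert can_access("finance", "financial_records") is True
assert can_access("sales", "financial_records") is False  # outside role scope
```

A deny-by-default mapping of this kind gives the employer its Section 7(i) protection (confidential data is reachable only on a need-to-know basis) without collecting any additional personal data about the employee.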
Measures that should not be deployed without strict necessity, proportionality, explicit notice, legal review, and a valid lawful basis:
Continuous webcam monitoring during working hours.
Periodic automated facial photograph capture.
Full-day screen recording.
Keystroke logging.
Home environment surveillance of any kind.
Continuous home IP address tracking or location profiling.
Behavioural profiling through data aggregation.
For any such measure that an employer nonetheless believes is strictly necessary for a specific, documented security purpose, all of the following conditions must be met before deployment:
A written, documented justification of strict necessity.
A proportionality assessment demonstrating that no less intrusive alternative exists.
A standalone, clear and plain notice to employees specifying exactly what is collected, how it is stored, who can access it, for how long, and what rights the employee has.
Legal review of the lawful basis.
Defined retention and erasure schedules under Rule 8.
A grievance mechanism under Section 13 through which employees can raise objections.
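A defined retention-and-erasure schedule of the kind referred to above can be expressed very simply: each class of record gets a fixed retention period, and anything older is flagged for erasure. The sketch below is illustrative only; the record classes and retention periods are assumptions, not values prescribed by Rule 8.

```python
from datetime import datetime, timedelta, timezone

# Assumed retention periods per record class (illustrative, not prescribed).
RETENTION_DAYS = {
    "access_log": 180,   # audit-trail records
    "dlp_alert": 365,    # security-incident records
}

def due_for_erasure(records, now=None):
    """Return the records whose retention period under the schedule has
    lapsed, so they can be erased rather than retained indefinitely."""
    now = now or datetime.now(timezone.utc)
    expired = []
    for rec in records:
        limit = timedelta(days=RETENTION_DAYS[rec["kind"]])
        if now - rec["created"] > limit:
            expired.append(rec)
    return expired

now = datetime(2026, 1, 1, tzinfo=timezone.utc)
records = [
    {"kind": "access_log", "created": datetime(2025, 1, 1, tzinfo=timezone.utc)},
    {"kind": "access_log", "created": datetime(2025, 12, 1, tzinfo=timezone.utc)},
]
# Only the year-old log has exceeded its 180-day retention period.
assert len(due_for_erasure(records, now)) == 1
```

Running such a job on a schedule turns the erasure obligation from a policy statement into an operational control that can be demonstrated to the Board.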
The Preferred WFH Governance Model
The preferred model for WFH governance under DPDP is trust-based, deliverable-based, and security-led — not surveillance-led.
Trust-based: employees are presumed to be performing their duties unless there is a specific, documented reason to investigate otherwise. Surveillance is not a substitute for management.
Deliverable-based: performance is measured by outputs, outcomes, and quality of work — not by hours of screen visibility or number of keystrokes. This is both more legally defensible and more effective at driving actual productivity.
Security-led: the employer’s legitimate concerns about data security and IP protection are addressed through robust technical controls — VPN, MFA, RBAC, DLP, access logs — that protect the organisation’s assets without surveilling employees’ homes.
Summary Assessment Table
WFH Practice | Lawful Basis Available | DPDP Status | Recommendation
Secure VPN access | Section 7(i) ✅ | Compliant | Implement as standard
MFA for all systems | Section 7(i) ✅ | Compliant | Implement as standard
Role-based access control | Section 7(i) ✅ | Compliant | Implement as standard
Access logs and audit trails | Section 7(i) ✅ | Compliant | Implement with retention schedule
DLP alerts (not full recording) | Section 7(i) ✅ | Compliant | Implement as standard
Task/deliverable-based review | No personal data | Preferred model | Implement as primary PM tool
Limited screen activity monitoring | Section 7(i) — conditional | Defensible if scoped | Limit to access metadata only
Full-day screen recording | No valid basis | Non-compliant | Do not deploy
Keystroke logging | No valid basis | Non-compliant | Do not deploy
Continuous webcam monitoring | No valid basis | Non-compliant | Do not deploy
Periodic webcam photographs | No valid basis | Non-compliant — biometric data | Do not deploy
Home IP continuous tracking | No valid basis | Non-compliant | Authentication use only
Behavioural profiling | No valid basis | Non-compliant | Do not deploy
The Bottom Line
WFH is now policy-positive, PM-endorsed, and legally permissible. It is a legitimate, productivity-enhancing, environmentally responsible mode of working that Indian employers should embrace wherever operationally feasible.
The monitoring architecture that accompanies it, however, must be minimal, transparent, proportionate, labour-law compliant, and privacy-preserving.
An employer who enables WFH while deploying continuous webcam surveillance, periodic facial photographs, keystroke logging, and full-day screen recording has not implemented WFH — it has installed a surveillance apparatus inside the employee’s home. That is not what the PM’s appeal envisioned, it is not what good management requires, and it is not what Indian law — including the DPDP Act — permits.
The right model is this: protect the organisation’s data and systems through robust technical controls. Measure performance through outputs and outcomes. Trust the employee with the dignity of their private home environment. And build the WFH policy on a legal foundation that will withstand scrutiny — from the Data Protection Board, from employees who know their rights, and from a workforce that will perform best when it is trusted rather than surveilled.
Source note: All DPDP analysis is derived from the Digital Personal Data Protection Act, 2023 (Sections 2(i), 2(j), 2(t), 2(u), 5, 6, 7(i), 8, 9, 12, 13), the DPDP Rules, 2025 (Rules 6, 8, Second Schedule), the Justice K.S. Puttaswamy (Retd.) vs Union of India Supreme Court judgment (2018) on the right to privacy and proportionality doctrine, and the CERT-In Elemental Cyber Defense Controls for MSMEs (Version 1.0, dated 01.09.2025). References to labour law, POSH, EPF, ESI and Shops and Establishments requirements are drawn from the respective statutes.
Disclaimer
This note is for general informational purposes only and should not be treated as legal opinion or professional advice. The applicability of Work from Home, employee monitoring, DPDP Act requirements, and labour law obligations may vary based on the facts, sector, state laws, employment terms, and internal policies.
Organisations should obtain a specific legal opinion from a qualified advocate / labour-law expert / data-protection professional before implementing or relying on any Work from Home or employee monitoring policy.