
Domain Adaptation Methods for Sparse Coding Based Non-Intrusive Load Monitoring

Chouchene, Skander (2024) Domain Adaptation Methods for Sparse Coding Based Non-Intrusive Load Monitoring. Masters thesis, Concordia University.

Text (application/pdf): Chouchene_MASc_S2024.pdf - Accepted Version, 4MB. Available under License Spectrum Terms of Access.

Abstract

Energy disaggregation, or Non-Intrusive Load Monitoring (NILM), is the task of estimating the consumption of individual appliances from only a building's main (aggregate) signal. Various methods have been proposed to solve this problem, including sparse coding (SC), which offers significant advantages thanks to its ability to capture complex patterns in the data. A challenging aspect of NILM, however, is that data containing appliance-level information is scarce. Moreover, the houses on which the models are tested may come from a different population than the training data, resulting in a domain shift. We therefore need approaches suited to training-data scarcity, which we address through transfer learning (TL), specifically domain adaptation. In this work, we explore domain adaptation approaches for SC models aimed at discriminative energy disaggregation (DD). We compare four methods that employ TL, two of which are deep architectures, with four methods that do not. In the second part of this thesis, we constrain NILM domain adaptation to a privacy-preserving Federated Learning framework, in which the NILM models are trained without exposing any data to models outside the building's own domain. This allows us to experiment with distributed methods in a more realistic setting, where user data is withheld from any third party. For this task, we propose four weighted federated domain adaptation methods. We also experiment with weighting schemes that further protect user privacy, resulting in a total of twelve approaches that we compare.
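To make the two building blocks mentioned in the abstract concrete, the sketch below illustrates, in Python, (a) reconstructing per-appliance signals from an aggregate signal using pre-learned non-negative dictionaries (the sparse coding view of NILM) and (b) combining per-building dictionaries with a weighted federated averaging step so that raw data never leaves a building. This is a minimal illustration under assumed shapes and names (disaggregate, weighted_federated_average, and the SciPy non-negative solver are illustrative choices), not the thesis implementation.

    # Minimal sketch, not the thesis implementation: sparse-coding-style NILM
    # disaggregation plus a weighted federated averaging of per-building
    # appliance dictionaries. All names and shapes are illustrative assumptions.
    import numpy as np
    from scipy.optimize import nnls

    def disaggregate(aggregate, dictionaries):
        """Estimate per-appliance signals from an aggregate power matrix.

        aggregate:    (T, N) matrix, T samples per window, N windows.
        dictionaries: list of (T, k_i) non-negative bases, one per appliance.
        Returns a list of (T, N) per-appliance estimates.
        """
        B = np.hstack(dictionaries)                  # concatenated dictionary
        A = np.zeros((B.shape[1], aggregate.shape[1]))
        for n in range(aggregate.shape[1]):
            # Non-negative codes per window; a sparsity penalty could be added.
            A[:, n], _ = nnls(B, aggregate[:, n])
        estimates, start = [], 0
        for B_i in dictionaries:
            k = B_i.shape[1]
            estimates.append(B_i @ A[start:start + k, :])
            start += k
        return estimates

    def weighted_federated_average(client_dicts, weights):
        """Aggregate per-building dictionaries without sharing raw data.

        client_dicts: list over clients; each entry is a list of per-appliance
                      bases with matching shapes across clients.
        weights:      one non-negative weight per client (e.g. sample counts or
                      a privacy-preserving proxy), normalised here to sum to 1.
        """
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()
        n_appliances = len(client_dicts[0])
        return [sum(w[c] * client_dicts[c][i] for c in range(len(client_dicts)))
                for i in range(n_appliances)]

In this view, only the dictionaries (model parameters) are exchanged between buildings and the server, while the choice of weights is where the thesis's weighted federated domain adaptation variants would differ.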

Divisions: Concordia University > Gina Cody School of Engineering and Computer Science > Concordia Institute for Information Systems Engineering
Item Type: Thesis (Masters)
Authors: Chouchene, Skander
Institution: Concordia University
Degree Name: M.A.Sc.
Program: Quality Systems Engineering
Date: 6 March 2024
Thesis Supervisor(s): Amayri, Manar and Bouguila, Nizar
Keywords: NILM, Sparse Coding, Domain Adaptation, Transfer Learning, Federated Learning, Deep Sparse Coding
ID Code: 993576
Deposited By: Skander Chouchene
Deposited On: 05 Jun 2024 16:51
Last Modified: 05 Jun 2024 16:51
