Jason Durrett
New Member
Forum Posts: 1
Member Since: August 17, 2022
Hello Forum,
I have a large data set recorded in hundredths of a second. Realistically, I only need one record per second, so 99% of the data is excess.
Initial data set (snippet attached): 432,000 rows / 101 columns
Goal: 4,320 rows / 101 columns
A) What is an effective method of keeping the whole-number records for each second (i.e. 0, 1, 2, 3, 4, ... 4,320) while deleting or excluding the non-whole-number records (i.e. 0.01, 0.02, 0.03, 0.04, 0.05, ... 4,319.99)?
Jessica Stewart
Northern USA
Member
Forum Posts: 219
Member Since: February 13, 2021
How is the data captured? Could you use Power Query to import and filter the data? If you get stuck, please upload a sample copy of your file with a mock-up of the desired result.
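If the data lands in a text file such as a CSV, Power Query can do the whole reduction in one filter step. Below is a minimal M sketch, not a tested solution: the file path, delimiter, and the "Time" column name are assumptions for illustration, and the scale-and-round guard keeps floating-point noise (e.g. 2 stored as 1.9999999) from dropping valid rows.

let
    // Load the raw CSV; the path and delimiter here are placeholders for illustration
    Source = Csv.Document(File.Contents("C:\Data\sample.csv"), [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // "Time" is a hypothetical column name; change it to match the real header
    Typed = Table.TransformColumnTypes(Promoted, {{"Time", type number}}),
    // Keep only whole-second rows: scale to hundredths, round away
    // floating-point noise, then test divisibility by 100
    WholeSeconds = Table.SelectRows(Typed, each Number.Mod(Number.Round([Time] * 100), 100) = 0)
in
    WholeSeconds

Loaded back to the worksheet, that should reduce the 432,000 rows to one row per second, and the query re-runs automatically whenever the source file is refreshed.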