
Jason Durrett
New Member

Forum Posts: 1
Member Since: August 17, 2022

Hello Forum,
I have a large data set recorded in hundredths of a second. Realistically I only need one record per second, so 99% of the rows are unnecessary.
Initial data set snippet attached:
432,000 rows / 101 columns
Goal:
4,320 rows / 101 columns
What is an effective method of keeping the whole-second rows (i.e. 0, 1, 2, 3, 4, ..., 4,320) while deleting or excluding the fractional-second rows (i.e. 0.01, 0.02, 0.03, 0.04, 0.05, ..., 4,319.99)?

Jessica Stewart
Northern USA
Trusted Member
Forum Posts: 219
Member Since: February 13, 2021

How is the data captured? Could you use Power Query to import and filter it? If you get stuck, please upload a sample copy of your file with a mock-up of the desired result.
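For example, here is a minimal Power Query (M) sketch, assuming the data comes from a CSV and the timestamp column is named "Time" (both the file path and the column name below are placeholders to adapt):

    let
        Source = Csv.Document(File.Contents("C:\Data\sample.csv")),
        Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
        Typed = Table.TransformColumnTypes(Promoted, {{"Time", type number}}),
        // Round to 2 decimals first in case the times were accumulated in 0.01 steps,
        // so a value like 2.9999999997 still counts as the whole second 3.00
        WholeSeconds = Table.SelectRows(Typed, each Number.Mod(Number.Round([Time], 2), 1) = 0)
    in
        WholeSeconds

Loading that back to the sheet should take you from 432,000 rows to the 4,320 you're after, and you can refresh the query whenever the source file changes.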