Jason Durrett
New Member
Members
Forum Posts: 1
Member Since:
August 17, 2022
Hello Forum,
I have a large data set recorded in hundredths of a second. Realistically, I only need one record per second, so 99% of the data is excess.
Initial Data Set Snippet attached:
432,000 rows / 101 columns
Goal:
4,320 rows / 101 columns
What is an effective method of keeping the whole-second rows (i.e. 0, 1, 2, 3, 4, ... 4,320) while deleting or excluding the non-whole-second rows (i.e. 0.01, 0.02, 0.03, 0.04, 0.05, ... 4,319.99)?
Jessica Stewart
Northern USA
Member
Members
Trusted Members
Forum Posts: 219
Member Since:
February 13, 2021
How is the data captured? Could you use Power Query to import and filter the data? If you get stuck, please upload a sample copy of your file with a mock-up of the desired result.
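If you end up pre-filtering the file outside Excel, the whole-second filter is simple to script. Here is a minimal Python sketch (my own illustration, not a Power Query solution) that assumes the timestamp is the first column of a CSV export and keeps only rows whose timestamp is a whole number:

```python
import csv
import io

def keep_whole_seconds(rows, time_col=0, tol=1e-9):
    """Yield only rows whose timestamp is a whole number of seconds.

    A small tolerance handles float noise like 1.0000000001.
    """
    for row in rows:
        t = float(row[time_col])
        # keep 0, 1, 2, ...; drop 0.01 ... 0.99, 1.01, ...
        if abs(t - round(t)) < tol:
            yield row

# In-memory sample standing in for the real 432,000-row file
sample = "0.00,a\n0.01,b\n0.99,c\n1.00,d\n1.01,e\n2.00,f\n"
reader = csv.reader(io.StringIO(sample))
kept = list(keep_whole_seconds(reader))
# kept → [['0.00', 'a'], ['1.00', 'd'], ['2.00', 'f']]
```

In Power Query the equivalent idea would be a filter step keeping rows where the timestamp equals its rounded value; the same one-in-a-hundred selection takes 432,000 rows down to 4,320.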