Apr 22 19:20:48.323765 ip-10-0-138-15 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 19:20:48.323778 ip-10-0-138-15 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 19:20:48.323788 ip-10-0-138-15 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 19:20:48.324102 ip-10-0-138-15 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 19:20:58.364284 ip-10-0-138-15 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 19:20:58.364308 ip-10-0-138-15 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot cb73c5a10ae442f0bebe8b4ebbd810a9 --
Apr 22 19:22:59.457218 ip-10-0-138-15 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:22:59.918902 ip-10-0-138-15 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:22:59.918902 ip-10-0-138-15 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:22:59.918902 ip-10-0-138-15 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:22:59.918902 ip-10-0-138-15 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:22:59.918902 ip-10-0-138-15 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag.
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:22:59.920555 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.920443 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:22:59.928415 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928391 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:22:59.928415 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928410 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:22:59.928415 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928414 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:22:59.928415 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928417 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:22:59.928415 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928420 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:22:59.928415 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928423 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928427 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928430 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928432 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928435 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928438 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928440 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928443 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928445 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928450 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928454 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928457 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928460 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928463 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928466 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928468 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928471 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928474 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928477 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:22:59.928653 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928479 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928482 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928484 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928487 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928490 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928492 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928504 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928507 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928510 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928512 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928515 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928518 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928520 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928523 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928525 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928529 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928532 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928535 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928538 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928540 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:22:59.929144 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928543 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928546 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928548 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928551 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928553 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928556 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928558 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928561 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928564 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928566 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928569 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928571 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928574 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928576 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928579 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928582 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928589 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928591 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928594 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928597 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:22:59.929649 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928601 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928605 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928608 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928610 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928614 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928617 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928619 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928623 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928626 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928628 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928631 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928634 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928637 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928639 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928643 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928646 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928649 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928651 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928654 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928657 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:22:59.930148 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928660 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.928663 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929084 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929091 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929094 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929097 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929099 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929102 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929105 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929108 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929112 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929115 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929118 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929120 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929123 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929125 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929128 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929131 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929133 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929137 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:22:59.930707 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929140 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929142 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929145 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929148 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929152 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929156 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929163 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929168 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929170 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929173 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929176 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929180 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929182 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929185 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929187 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929190 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929193 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929196 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929199 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929201 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:22:59.931237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929204 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929206 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929209 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929211 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929214 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929217 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929219 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929222 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929224 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929227 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929230 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929232 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929235 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929238 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929240 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929243 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929245 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929247 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929250 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929253 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:22:59.931731 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929255 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929258 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929260 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929264 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929266 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929268 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929271 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929274 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929277 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929280 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929282 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929285 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929287 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929290 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929292 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929295 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929297 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929300 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929302 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929305 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:22:59.932237 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929307 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929310 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929312 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929314 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929317 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929319 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929322 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.929325 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930697 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930706 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930713 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930718 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930723 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930727 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930731 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930736 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930740 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930743 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930746 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930750 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930754 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930757 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:22:59.932744 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930760 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930763 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930766 2576 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930769 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930772 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930777 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930780 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930783 2576 flags.go:64] FLAG: --config-dir=""
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930786 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930789 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930793 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930796 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930799 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930802 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930805 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930809 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930812 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930814 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930818 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930822 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930825 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930828 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930831 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930834 2576 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930837 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:22:59.933293 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930842 2576 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930845 2576 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930848 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930855 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930858 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930862 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930865 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930868 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930871 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930874 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930877 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930880 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930882 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930885 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930888 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930891 2576 flags.go:64] FLAG:
--feature-gates="" Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930895 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930898 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930901 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930905 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930908 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930912 2576 flags.go:64] FLAG: --help="false" Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930915 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-138-15.ec2.internal" Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930931 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:22:59.933886 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930935 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930938 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930941 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930945 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930947 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 
19:22:59.930950 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930953 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930957 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930960 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930963 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930965 2576 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930970 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930973 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930977 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930979 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930982 2576 flags.go:64] FLAG: --lock-file="" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930985 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930989 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930992 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.930997 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:22:59.934476 
ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931000 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931003 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931006 2576 flags.go:64] FLAG: --logging-format="text" Apr 22 19:22:59.934476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931009 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931012 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931015 2576 flags.go:64] FLAG: --manifest-url="" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931018 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931022 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931025 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931029 2576 flags.go:64] FLAG: --max-pods="110" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931033 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931036 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931039 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931042 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931045 2576 flags.go:64] 
FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931048 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931051 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931059 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931062 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931065 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931069 2576 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931072 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931077 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931080 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931084 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931089 2576 flags.go:64] FLAG: --port="10250" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931092 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:22:59.935084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931095 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0abe8dfc0697f8e01" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 
19:22:59.931098 2576 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931101 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931104 2576 flags.go:64] FLAG: --register-node="true" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931107 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931110 2576 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931114 2576 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931117 2576 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931120 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931123 2576 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931126 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931130 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931133 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931135 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931138 2576 flags.go:64] FLAG: --runonce="false" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931141 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931144 2576 
flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931147 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931149 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931152 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931155 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931158 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931161 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931164 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931167 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931170 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:22:59.935661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931174 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931177 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931180 2576 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931184 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931191 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 
19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931194 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931197 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931202 2576 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931205 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931208 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931210 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931213 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931216 2576 flags.go:64] FLAG: --v="2" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931221 2576 flags.go:64] FLAG: --version="false" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931225 2576 flags.go:64] FLAG: --vmodule="" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931228 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931232 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931324 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931327 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931330 2576 
feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931333 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931336 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931339 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931341 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:22:59.936329 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931344 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931347 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931349 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931352 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931354 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931357 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931360 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931362 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: 
W0422 19:22:59.931365 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931368 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931371 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931374 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931378 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931380 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931383 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931385 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931388 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931390 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931393 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931396 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:22:59.936941 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931398 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:22:59.937447 
ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931402 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931406 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931409 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931412 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931415 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931417 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931420 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931422 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931425 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931427 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931430 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931433 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931435 2576 feature_gate.go:328] 
unrecognized feature gate: InsightsConfigAPI Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931438 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931440 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931443 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931445 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931448 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:22:59.937447 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931450 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931453 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931456 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931459 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931463 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931469 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931472 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931475 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931477 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931480 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931482 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931485 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931488 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931490 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931493 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931495 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931498 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931500 2576 
feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931503 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:22:59.937914 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931506 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931508 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931511 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931513 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931516 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931518 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931521 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931523 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931526 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931529 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931531 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:22:59.938410 
ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931534 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931536 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931539 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931542 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931544 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931547 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931549 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931554 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931556 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:22:59.938410 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.931558 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:22:59.938931 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.931567 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:22:59.938931 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.938068 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:22:59.938931 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.938086 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:22:59.939381 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.938183 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:22:59.940872 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:22:59.938352 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:22:59.943223 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.939447 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 19:22:59.943777 ip-10-0-138-15 kubenswrapper[2576]: I0422 
19:22:59.943763 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 19:22:59.944764 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.944751 2576 server.go:1019] "Starting client certificate rotation"
Apr 22 19:22:59.944864 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.944849 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:22:59.944894 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.944887 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:22:59.969440 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.969424 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:22:59.971212 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.971195 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:22:59.989782 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.989759 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 22 19:22:59.995965 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.995948 2576 log.go:25] "Validated CRI v1 image API"
Apr 22 19:22:59.997324 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.997309 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:22:59.999734 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.999708 2576 fs.go:135] Filesystem UUIDs: map[4ac332d7-e45f-4c32-a647-c02b2ff0e78b:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 d2f9ab65-ef63-4f3a-84ec-ddd4d4138c63:/dev/nvme0n1p4]
Apr 22 19:22:59.999818 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:22:59.999732 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:23:00.004967 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.004943 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:23:00.005301 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.005192 2576 manager.go:217] Machine: {Timestamp:2026-04-22 19:23:00.003964095 +0000 UTC m=+0.426629393 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100355 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24e3f250bb7938945a1f081b4d1a16 SystemUUID:ec24e3f2-50bb-7938-945a-1f081b4d1a16 BootID:cb73c5a1-0ae4-42f0-bebe-8b4ebbd810a9 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:86:c6:cc:62:d3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:86:c6:cc:62:d3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:b8:08:f6:4b:d3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:23:00.005301 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.005298 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 19:23:00.005399 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.005388 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:23:00.007788 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.007762 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:23:00.007952 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.007791 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-15.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:23:00.008499 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.008488 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 19:23:00.008536 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.008502 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:23:00.008536 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.008516 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:23:00.008588 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.008549 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:23:00.010218 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.010207 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:23:00.010330 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.010321 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:23:00.013345 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.013334 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:23:00.013392 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.013349 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:23:00.013392 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.013362 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:23:00.013392 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.013371 2576 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:23:00.013392 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.013379 2576 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 22 19:23:00.014786 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.014773 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:23:00.014841 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.014793 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:23:00.017817 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.017803 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:23:00.021432 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.021411 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:23:00.023951 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.023932 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:23:00.023951 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.023954 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:23:00.024065 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.023962 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:23:00.024065 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.023968 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:23:00.024065 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.023974 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:23:00.024065 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.023980 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:23:00.024065 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.024015 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 
19:23:00.024065 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.024024 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:23:00.024065 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.024034 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:23:00.024065 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.024040 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:23:00.024065 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.024049 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:23:00.024065 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.024057 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:23:00.025088 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.025077 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:23:00.025088 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.025088 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:23:00.028641 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.028628 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:23:00.028686 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.028664 2576 server.go:1295] "Started kubelet" Apr 22 19:23:00.028790 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.028750 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:23:00.028848 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.028763 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:23:00.028898 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.028834 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:23:00.029559 ip-10-0-138-15 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 19:23:00.030193 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.030075 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:23:00.030804 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.030786 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-15.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:23:00.030864 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.030847 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-15.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:23:00.030916 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.030862 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:23:00.031438 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.031424 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:23:00.037872 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.037853 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:23:00.037872 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.037869 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:23:00.038070 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.038032 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 19:23:00.038286 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.037272 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-15.ec2.internal.18a8c433dd0cccb5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-15.ec2.internal,UID:ip-10-0-138-15.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-15.ec2.internal,},FirstTimestamp:2026-04-22 19:23:00.028640437 +0000 UTC m=+0.451305734,LastTimestamp:2026-04-22 19:23:00.028640437 +0000 UTC m=+0.451305734,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-15.ec2.internal,}" Apr 22 19:23:00.038592 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.038571 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:23:00.038592 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.038593 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:23:00.038731 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.038676 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:23:00.038731 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.038723 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:23:00.038731 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.038732 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:23:00.038867 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.038821 2576 factory.go:153] Registering CRI-O factory Apr 22 19:23:00.038867 
ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.038841 2576 factory.go:223] Registration of the crio container factory successfully Apr 22 19:23:00.038981 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.038908 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:23:00.038981 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.038934 2576 factory.go:55] Registering systemd factory Apr 22 19:23:00.038981 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.038943 2576 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:23:00.038981 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.038965 2576 factory.go:103] Registering Raw factory Apr 22 19:23:00.038981 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.038979 2576 manager.go:1196] Started watching for new ooms in manager Apr 22 19:23:00.038981 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.038977 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found" Apr 22 19:23:00.039855 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.039842 2576 manager.go:319] Starting recovery of all containers Apr 22 19:23:00.048873 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.048821 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 19:23:00.049191 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.049162 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-15.ec2.internal\" is 
forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 19:23:00.052557 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.052540 2576 manager.go:324] Recovery completed Apr 22 19:23:00.057109 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.057095 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:00.059907 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.059796 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:00.060010 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.059938 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:00.060010 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.059954 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:00.060497 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.060484 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:23:00.060497 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.060495 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:23:00.060582 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.060510 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:23:00.062513 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.062451 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-15.ec2.internal.18a8c433dee9e877 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-15.ec2.internal,UID:ip-10-0-138-15.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-15.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-15.ec2.internal,},FirstTimestamp:2026-04-22 19:23:00.059908215 +0000 UTC m=+0.482573520,LastTimestamp:2026-04-22 19:23:00.059908215 +0000 UTC m=+0.482573520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-15.ec2.internal,}" Apr 22 19:23:00.062928 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.062903 2576 policy_none.go:49] "None policy: Start" Apr 22 19:23:00.062960 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.062938 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:23:00.062960 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.062948 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:23:00.068876 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.068857 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-m954l" Apr 22 19:23:00.071637 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.071575 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-15.ec2.internal.18a8c433deea7d05 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-15.ec2.internal,UID:ip-10-0-138-15.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-138-15.ec2.internal status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-138-15.ec2.internal,},FirstTimestamp:2026-04-22 19:23:00.059946245 +0000 UTC m=+0.482611545,LastTimestamp:2026-04-22 19:23:00.059946245 +0000 UTC m=+0.482611545,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-15.ec2.internal,}" Apr 22 19:23:00.078335 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.078321 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-m954l" Apr 22 19:23:00.108489 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.108471 2576 manager.go:341] "Starting Device Plugin manager" Apr 22 19:23:00.118673 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.108508 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:23:00.118673 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.108520 2576 server.go:85] "Starting device plugin registration server" Apr 22 19:23:00.118673 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.108774 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:23:00.118673 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.108786 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:23:00.118673 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.108857 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:23:00.118673 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.108943 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:23:00.118673 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.108951 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:23:00.118673 ip-10-0-138-15 kubenswrapper[2576]: E0422 
19:23:00.109635 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:23:00.118673 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.109672 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-15.ec2.internal\" not found" Apr 22 19:23:00.149417 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.149388 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:23:00.150511 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.150494 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 19:23:00.150604 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.150519 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:23:00.150604 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.150540 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 19:23:00.150604 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.150551 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:23:00.150756 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.150591 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:23:00.153797 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.153784 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:00.209186 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.209113 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:00.210112 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.210090 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:00.210203 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.210121 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:00.210203 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.210136 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:00.210203 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.210161 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-15.ec2.internal" Apr 22 19:23:00.221204 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.221183 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-15.ec2.internal" Apr 22 19:23:00.221269 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.221209 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-15.ec2.internal\": node \"ip-10-0-138-15.ec2.internal\" not found" Apr 22 19:23:00.247738 
ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.247710 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found" Apr 22 19:23:00.251629 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.251609 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal"] Apr 22 19:23:00.251686 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.251668 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:00.253359 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.253345 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:00.253486 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.253374 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:00.253486 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.253390 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:00.254604 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.254592 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:00.254771 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.254757 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal" Apr 22 19:23:00.254819 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.254784 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:00.255287 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.255272 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:00.255391 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.255300 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:00.255391 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.255316 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:00.255391 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.255275 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:00.255391 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.255367 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:00.255391 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.255377 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:00.256480 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.256465 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal" Apr 22 19:23:00.256562 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.256501 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:00.257202 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.257186 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:00.257262 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.257215 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:00.257262 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.257228 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:00.284674 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.284655 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-15.ec2.internal\" not found" node="ip-10-0-138-15.ec2.internal" Apr 22 19:23:00.288972 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.288954 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-15.ec2.internal\" not found" node="ip-10-0-138-15.ec2.internal" Apr 22 19:23:00.339979 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.339949 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99b983d6e870652c0db1327874b8eda4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal\" (UID: \"99b983d6e870652c0db1327874b8eda4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal" Apr 22 19:23:00.339979 ip-10-0-138-15 kubenswrapper[2576]: I0422 
19:23:00.339980 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99b983d6e870652c0db1327874b8eda4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal\" (UID: \"99b983d6e870652c0db1327874b8eda4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 22 19:23:00.340171 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.339997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e477e0c5a4e100c99b94e03d47d9bc3f-config\") pod \"kube-apiserver-proxy-ip-10-0-138-15.ec2.internal\" (UID: \"e477e0c5a4e100c99b94e03d47d9bc3f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal"
Apr 22 19:23:00.348025 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.348006 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 22 19:23:00.440851 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.440820 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99b983d6e870652c0db1327874b8eda4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal\" (UID: \"99b983d6e870652c0db1327874b8eda4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 22 19:23:00.440851 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.440850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99b983d6e870652c0db1327874b8eda4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal\" (UID: \"99b983d6e870652c0db1327874b8eda4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 22 19:23:00.441063 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.440867 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e477e0c5a4e100c99b94e03d47d9bc3f-config\") pod \"kube-apiserver-proxy-ip-10-0-138-15.ec2.internal\" (UID: \"e477e0c5a4e100c99b94e03d47d9bc3f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal"
Apr 22 19:23:00.441063 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.440901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e477e0c5a4e100c99b94e03d47d9bc3f-config\") pod \"kube-apiserver-proxy-ip-10-0-138-15.ec2.internal\" (UID: \"e477e0c5a4e100c99b94e03d47d9bc3f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal"
Apr 22 19:23:00.441063 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.440910 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99b983d6e870652c0db1327874b8eda4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal\" (UID: \"99b983d6e870652c0db1327874b8eda4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 22 19:23:00.441063 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.440939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99b983d6e870652c0db1327874b8eda4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal\" (UID: \"99b983d6e870652c0db1327874b8eda4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 22 19:23:00.448891 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.448865 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 22 19:23:00.549668 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.549586 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 22 19:23:00.586785 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.586749 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 22 19:23:00.591473 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.591455 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal"
Apr 22 19:23:00.650438 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.650404 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 22 19:23:00.750961 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.750936 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 22 19:23:00.851485 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.851411 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 22 19:23:00.944984 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.944958 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:23:00.945461 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:00.945093 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:00.952119 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:00.952091 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 22 19:23:01.038378 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.038344 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:23:01.053041 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:01.053020 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 22 19:23:01.055103 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:23:01.055076 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode477e0c5a4e100c99b94e03d47d9bc3f.slice/crio-2d08855a250d1b7445e523f9ad4625c8cf9c7a9710a56999633d02fdd47b230a WatchSource:0}: Error finding container 2d08855a250d1b7445e523f9ad4625c8cf9c7a9710a56999633d02fdd47b230a: Status 404 returned error can't find the container with id 2d08855a250d1b7445e523f9ad4625c8cf9c7a9710a56999633d02fdd47b230a
Apr 22 19:23:01.057312 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.057294 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:23:01.059128 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:23:01.059104 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99b983d6e870652c0db1327874b8eda4.slice/crio-7e762d42eac4d3aee09b728208879fc188d06af2af06a1f10c0aabcdb00651fa WatchSource:0}: Error finding container 7e762d42eac4d3aee09b728208879fc188d06af2af06a1f10c0aabcdb00651fa: Status 404 returned error can't find the container with id 7e762d42eac4d3aee09b728208879fc188d06af2af06a1f10c0aabcdb00651fa
Apr 22 19:23:01.060234 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.060219 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:23:01.080678 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.080652 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:18:00 +0000 UTC" deadline="2027-12-18 16:33:00.337321254 +0000 UTC"
Apr 22 19:23:01.080678 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.080675 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14517h9m59.256649378s"
Apr 22 19:23:01.135279 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.135252 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hx8p2"
Apr 22 19:23:01.145538 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.145514 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hx8p2"
Apr 22 19:23:01.153108 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:01.153091 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 22 19:23:01.153429 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.153393 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal" event={"ID":"99b983d6e870652c0db1327874b8eda4","Type":"ContainerStarted","Data":"7e762d42eac4d3aee09b728208879fc188d06af2af06a1f10c0aabcdb00651fa"}
Apr 22 19:23:01.154321 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.154302 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal" event={"ID":"e477e0c5a4e100c99b94e03d47d9bc3f","Type":"ContainerStarted","Data":"2d08855a250d1b7445e523f9ad4625c8cf9c7a9710a56999633d02fdd47b230a"}
Apr 22 19:23:01.253941 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:01.253903 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-15.ec2.internal\" not found"
Apr 22 19:23:01.322884 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.322853 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:01.338331 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.338311 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal"
Apr 22 19:23:01.344014 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.343987 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:01.365090 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.365060 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:23:01.366188 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.366163 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal"
Apr 22 19:23:01.385713 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.385642 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:23:01.516257 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:01.516215 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:02.015415 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.015385 2576 apiserver.go:52] "Watching apiserver"
Apr 22 19:23:02.025149 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.025127 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 19:23:02.025572 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.025549 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-z4mnr","openshift-image-registry/node-ca-47nkl","openshift-multus/network-metrics-daemon-m8fmk","openshift-network-diagnostics/network-check-target-wrvx6","openshift-network-operator/iptables-alerter-qpgmr","openshift-ovn-kubernetes/ovnkube-node-w57k8","kube-system/konnectivity-agent-cznmn","kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal","openshift-multus/multus-additional-cni-plugins-2hbdj","openshift-multus/multus-jxhs2","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h"]
Apr 22 19:23:02.028213 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.028189 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.028321 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.028289 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-47nkl"
Apr 22 19:23:02.029453 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.029429 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk"
Apr 22 19:23:02.029551 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:02.029528 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9"
Apr 22 19:23:02.030603 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.030582 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6"
Apr 22 19:23:02.030694 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:02.030641 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb"
Apr 22 19:23:02.031200 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.031166 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-xn94d\""
Apr 22 19:23:02.031200 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.031175 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:23:02.031200 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.031182 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 19:23:02.031366 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.031184 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 19:23:02.031366 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.031333 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 19:23:02.031479 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.031459 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 19:23:02.031574 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.031463 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-x97sx\""
Apr 22 19:23:02.031704 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.031685 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qpgmr"
Apr 22 19:23:02.032990 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.032970 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.034104 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.034087 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cznmn"
Apr 22 19:23:02.035158 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.035133 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 19:23:02.035504 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.035483 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.036581 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.036080 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-d7xfq\""
Apr 22 19:23:02.036581 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.036090 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:23:02.036581 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.036104 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 19:23:02.036581 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.036147 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 19:23:02.036581 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.036461 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 19:23:02.036581 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.036525 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 19:23:02.036581 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.036572 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 19:23:02.036968 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.036952 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vrcmd\""
Apr 22 19:23:02.037068 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.036951 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 19:23:02.038078 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.036956 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 19:23:02.038078 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.037361 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.038216 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.038112 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x9h4c\""
Apr 22 19:23:02.038267 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.038218 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 19:23:02.038299 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.038275 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rswps\""
Apr 22 19:23:02.038414 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.038373 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 19:23:02.040216 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.039383 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 19:23:02.040216 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.039672 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 19:23:02.040216 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.039735 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 19:23:02.040216 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.039910 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 19:23:02.040216 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.039993 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 19:23:02.040498 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.040312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-t7m4l\""
Apr 22 19:23:02.040498 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.040395 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 19:23:02.041957 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.041935 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h"
Apr 22 19:23:02.044789 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.044717 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 19:23:02.044789 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.044722 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 19:23:02.044789 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.044739 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-z2btp\""
Apr 22 19:23:02.045000 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.044797 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 19:23:02.049375 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.049375 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-ovnkube-config\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.049523 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-multus-conf-dir\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.049523 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-sys-fs\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h"
Apr 22 19:23:02.049523 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049498 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-systemd\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.049655 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/be8c9b4d-d8b6-438d-adfb-b1521f3c0d84-iptables-alerter-script\") pod \"iptables-alerter-qpgmr\" (UID: \"be8c9b4d-d8b6-438d-adfb-b1521f3c0d84\") " pod="openshift-network-operator/iptables-alerter-qpgmr"
Apr 22 19:23:02.049655 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049590 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/280aa335-840b-490c-a36f-0cdef337ab79-system-cni-dir\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.049655 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049620 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-os-release\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.049791 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049661 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-lib-modules\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.049791 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-multus-socket-dir-parent\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.049791 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049708 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zrh6\" (UniqueName: \"kubernetes.io/projected/7eb8b708-4ebf-4d5f-b8a0-ee69ff963778-kube-api-access-8zrh6\") pod \"node-ca-47nkl\" (UID: \"7eb8b708-4ebf-4d5f-b8a0-ee69ff963778\") " pod="openshift-image-registry/node-ca-47nkl"
Apr 22 19:23:02.049791 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-etc-openvswitch\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.049791 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/280aa335-840b-490c-a36f-0cdef337ab79-cnibin\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.050009 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049791 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-var-lib-cni-bin\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.050009 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049817 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-socket-dir\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h"
Apr 22 19:23:02.050009 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-sys\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.050009 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049864 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-run-openvswitch\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.050009 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-node-log\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.050009 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049946 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ee930469-602e-4383-9900-a97a25da678b-multus-daemon-config\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.050009 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-device-dir\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h"
Apr 22 19:23:02.050009 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.049992 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-sysctl-d\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.050277 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050024 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-tuned\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.050277 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/280aa335-840b-490c-a36f-0cdef337ab79-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.050277 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050079 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-run-k8s-cni-cncf-io\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.050277 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-run-netns\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.050277 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-hostroot\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.050277 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050129 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-run\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.050277 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8jsd\" (UniqueName: \"kubernetes.io/projected/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-kube-api-access-f8jsd\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.050277 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/280aa335-840b-490c-a36f-0cdef337ab79-cni-binary-copy\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.050277 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050234 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-registration-dir\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h"
Apr 22 19:23:02.050277 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050258 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-sysconfig\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.050748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050285 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-systemd-units\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.050748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-run-systemd\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.050748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050403 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-ovn-node-metrics-cert\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.050748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050438 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/280aa335-840b-490c-a36f-0cdef337ab79-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.050748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050467 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/280aa335-840b-490c-a36f-0cdef337ab79-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.050748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050497 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-system-cni-dir\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.050748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs\") pod \"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk"
Apr 22 19:23:02.050748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050562 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-var-lib-openvswitch\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.050748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-run-multus-certs\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.050748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-etc-kubernetes\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.050748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-tmp\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.050748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050659 2576 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7eb8b708-4ebf-4d5f-b8a0-ee69ff963778-host\") pod \"node-ca-47nkl\" (UID: \"7eb8b708-4ebf-4d5f-b8a0-ee69ff963778\") " pod="openshift-image-registry/node-ca-47nkl" Apr 22 19:23:02.050748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050681 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-run-ovn-kubernetes\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.050748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050723 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zs5q\" (UniqueName: \"kubernetes.io/projected/ee930469-602e-4383-9900-a97a25da678b-kube-api-access-9zs5q\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050768 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-etc-selinux\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be8c9b4d-d8b6-438d-adfb-b1521f3c0d84-host-slash\") pod \"iptables-alerter-qpgmr\" (UID: \"be8c9b4d-d8b6-438d-adfb-b1521f3c0d84\") " pod="openshift-network-operator/iptables-alerter-qpgmr" 
Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050814 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-modprobe-d\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050832 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-sysctl-conf\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050849 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-var-lib-kubelet\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050871 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7kcl\" (UniqueName: \"kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl\") pod \"network-check-target-wrvx6\" (UID: \"7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb\") " pod="openshift-network-diagnostics/network-check-target-wrvx6"
Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-env-overrides\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.050979 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/280aa335-840b-490c-a36f-0cdef337ab79-os-release\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051006 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-kubernetes\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-log-socket\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-var-lib-cni-multus\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051074 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-slash\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-multus-cni-dir\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051117 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ee930469-602e-4383-9900-a97a25da678b-cni-binary-copy\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h"
Apr 22 19:23:02.051329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051175 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hskvc\" (UniqueName: \"kubernetes.io/projected/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-kube-api-access-hskvc\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051199 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvwmj\" (UniqueName: \"kubernetes.io/projected/be8c9b4d-d8b6-438d-adfb-b1521f3c0d84-kube-api-access-zvwmj\") pod \"iptables-alerter-qpgmr\" (UID: \"be8c9b4d-d8b6-438d-adfb-b1521f3c0d84\") " pod="openshift-network-operator/iptables-alerter-qpgmr"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051221 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-kubelet\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051255 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-host\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd77j\" (UniqueName: \"kubernetes.io/projected/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-kube-api-access-vd77j\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051316 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4edf430-4780-4b0d-b495-50534d4ddccc-agent-certs\") pod \"konnectivity-agent-cznmn\" (UID: \"f4edf430-4780-4b0d-b495-50534d4ddccc\") " pod="kube-system/konnectivity-agent-cznmn"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051340 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppnzc\" (UniqueName: \"kubernetes.io/projected/280aa335-840b-490c-a36f-0cdef337ab79-kube-api-access-ppnzc\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051364 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfvz6\" (UniqueName: \"kubernetes.io/projected/957d9773-bf39-486e-a32e-eba60e7b49e9-kube-api-access-lfvz6\") pod \"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7eb8b708-4ebf-4d5f-b8a0-ee69ff963778-serviceca\") pod \"node-ca-47nkl\" (UID: \"7eb8b708-4ebf-4d5f-b8a0-ee69ff963778\") " pod="openshift-image-registry/node-ca-47nkl"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-run-ovn\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-cni-bin\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-cni-netd\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051527 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-ovnkube-script-lib\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f4edf430-4780-4b0d-b495-50534d4ddccc-konnectivity-ca\") pod \"konnectivity-agent-cznmn\" (UID: \"f4edf430-4780-4b0d-b495-50534d4ddccc\") " pod="kube-system/konnectivity-agent-cznmn"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-cnibin\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051597 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-var-lib-kubelet\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.051991 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.051649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-run-netns\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.139329 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.139301 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 19:23:02.147224 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.147187 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:01 +0000 UTC" deadline="2027-12-29 06:38:09.36416687 +0000 UTC"
Apr 22 19:23:02.147224 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.147220 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14771h15m7.216950201s"
Apr 22 19:23:02.152052 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/280aa335-840b-490c-a36f-0cdef337ab79-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.152175 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-run-k8s-cni-cncf-io\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.152175 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-run-netns\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.152175 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-hostroot\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.152175 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-run-k8s-cni-cncf-io\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.152175 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-run\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.152451 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152176 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-run-netns\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.152451 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8jsd\" (UniqueName: \"kubernetes.io/projected/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-kube-api-access-f8jsd\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.152451 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152209 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-hostroot\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.152451 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/280aa335-840b-490c-a36f-0cdef337ab79-cni-binary-copy\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.152451 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-registration-dir\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h"
Apr 22 19:23:02.152451 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152240 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/280aa335-840b-490c-a36f-0cdef337ab79-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.152451 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152253 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-run\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.152451 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-sysconfig\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.152451 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-systemd-units\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.152451 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152345 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-run-systemd\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.152451 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152390 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-run-systemd\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.152451 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-systemd-units\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.152451 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152446 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-registration-dir\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h"
Apr 22 19:23:02.153070 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-sysconfig\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.153070 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152530 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-ovn-node-metrics-cert\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.153070 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/280aa335-840b-490c-a36f-0cdef337ab79-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.153070 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/280aa335-840b-490c-a36f-0cdef337ab79-cni-binary-copy\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.153070 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152948 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 19:23:02.153303 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.152614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/280aa335-840b-490c-a36f-0cdef337ab79-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.153303 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-system-cni-dir\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.153303 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs\") pod \"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk"
Apr 22 19:23:02.153303 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153241 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-var-lib-openvswitch\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.153303 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/280aa335-840b-490c-a36f-0cdef337ab79-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.153528 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153325 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-run-multus-certs\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.153528 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-etc-kubernetes\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.153528 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:02.153372 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:02.153528 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-tmp\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr"
Apr 22 19:23:02.153528 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153417 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7eb8b708-4ebf-4d5f-b8a0-ee69ff963778-host\") pod \"node-ca-47nkl\" (UID: \"7eb8b708-4ebf-4d5f-b8a0-ee69ff963778\") " pod="openshift-image-registry/node-ca-47nkl"
Apr 22 19:23:02.153528 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:02.153449 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs podName:957d9773-bf39-486e-a32e-eba60e7b49e9 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:02.653418631 +0000 UTC m=+3.076083930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs") pod "network-metrics-daemon-m8fmk" (UID: "957d9773-bf39-486e-a32e-eba60e7b49e9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:02.153528 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-run-ovn-kubernetes\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.153528 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7eb8b708-4ebf-4d5f-b8a0-ee69ff963778-host\") pod \"node-ca-47nkl\" (UID: \"7eb8b708-4ebf-4d5f-b8a0-ee69ff963778\") " pod="openshift-image-registry/node-ca-47nkl"
Apr 22 19:23:02.153528 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zs5q\" (UniqueName: \"kubernetes.io/projected/ee930469-602e-4383-9900-a97a25da678b-kube-api-access-9zs5q\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.153528 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-var-lib-openvswitch\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153531 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/280aa335-840b-490c-a36f-0cdef337ab79-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj"
Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-etc-selinux\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h"
Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be8c9b4d-d8b6-438d-adfb-b1521f3c0d84-host-slash\") pod \"iptables-alerter-qpgmr\" (UID: \"be8c9b4d-d8b6-438d-adfb-b1521f3c0d84\") " pod="openshift-network-operator/iptables-alerter-qpgmr"
Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-system-cni-dir\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-run-multus-certs\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2"
Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422
19:23:02.153618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-modprobe-d\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153647 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-etc-kubernetes\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-sysctl-conf\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153708 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-run-ovn-kubernetes\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153764 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be8c9b4d-d8b6-438d-adfb-b1521f3c0d84-host-slash\") pod \"iptables-alerter-qpgmr\" (UID: \"be8c9b4d-d8b6-438d-adfb-b1521f3c0d84\") " pod="openshift-network-operator/iptables-alerter-qpgmr" Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 
19:23:02.153784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-var-lib-kubelet\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7kcl\" (UniqueName: \"kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl\") pod \"network-check-target-wrvx6\" (UID: \"7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb\") " pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-env-overrides\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153855 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/280aa335-840b-490c-a36f-0cdef337ab79-os-release\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj" Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-modprobe-d\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 
19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-etc-selinux\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" Apr 22 19:23:02.153989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153893 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-kubernetes\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.154810 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153976 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-kubernetes\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.154810 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.153992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/280aa335-840b-490c-a36f-0cdef337ab79-os-release\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj" Apr 22 19:23:02.154810 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.154009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-log-socket\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.154810 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.154042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-sysctl-conf\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.154810 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.154043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-var-lib-kubelet\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.154810 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.154042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-var-lib-cni-multus\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.154810 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.154102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-log-socket\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.154810 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.154137 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-var-lib-cni-multus\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " 
pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.155660 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.155401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-env-overrides\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.155660 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.155415 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-slash\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.155660 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.155471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-slash\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.155660 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.155473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-multus-cni-dir\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.155660 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.155617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-multus-cni-dir\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.155660 
ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.155642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ee930469-602e-4383-9900-a97a25da678b-cni-binary-copy\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.156039 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.155694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" Apr 22 19:23:02.156039 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.155727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hskvc\" (UniqueName: \"kubernetes.io/projected/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-kube-api-access-hskvc\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" Apr 22 19:23:02.156039 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.155788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvwmj\" (UniqueName: \"kubernetes.io/projected/be8c9b4d-d8b6-438d-adfb-b1521f3c0d84-kube-api-access-zvwmj\") pod \"iptables-alerter-qpgmr\" (UID: \"be8c9b4d-d8b6-438d-adfb-b1521f3c0d84\") " pod="openshift-network-operator/iptables-alerter-qpgmr" Apr 22 19:23:02.156039 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.155818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-kubelet\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.156039 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.155854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-host\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.156039 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.155909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vd77j\" (UniqueName: \"kubernetes.io/projected/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-kube-api-access-vd77j\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.156039 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.155998 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4edf430-4780-4b0d-b495-50534d4ddccc-agent-certs\") pod \"konnectivity-agent-cznmn\" (UID: \"f4edf430-4780-4b0d-b495-50534d4ddccc\") " pod="kube-system/konnectivity-agent-cznmn" Apr 22 19:23:02.156345 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156022 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppnzc\" (UniqueName: \"kubernetes.io/projected/280aa335-840b-490c-a36f-0cdef337ab79-kube-api-access-ppnzc\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj" Apr 22 19:23:02.156345 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfvz6\" (UniqueName: \"kubernetes.io/projected/957d9773-bf39-486e-a32e-eba60e7b49e9-kube-api-access-lfvz6\") pod 
\"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:02.156345 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156125 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7eb8b708-4ebf-4d5f-b8a0-ee69ff963778-serviceca\") pod \"node-ca-47nkl\" (UID: \"7eb8b708-4ebf-4d5f-b8a0-ee69ff963778\") " pod="openshift-image-registry/node-ca-47nkl" Apr 22 19:23:02.156345 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156164 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-run-ovn\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.156345 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-cni-bin\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.156345 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-cni-netd\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.156345 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-ovnkube-script-lib\") pod 
\"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.156345 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f4edf430-4780-4b0d-b495-50534d4ddccc-konnectivity-ca\") pod \"konnectivity-agent-cznmn\" (UID: \"f4edf430-4780-4b0d-b495-50534d4ddccc\") " pod="kube-system/konnectivity-agent-cznmn" Apr 22 19:23:02.156345 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-cnibin\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.156727 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-var-lib-kubelet\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.156727 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-run-netns\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.156727 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156538 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.156727 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-ovnkube-config\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.156727 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-multus-conf-dir\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.156727 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-sys-fs\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" Apr 22 19:23:02.156727 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-systemd\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.157046 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156731 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/be8c9b4d-d8b6-438d-adfb-b1521f3c0d84-iptables-alerter-script\") pod \"iptables-alerter-qpgmr\" (UID: \"be8c9b4d-d8b6-438d-adfb-b1521f3c0d84\") " pod="openshift-network-operator/iptables-alerter-qpgmr" Apr 22 19:23:02.157046 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/280aa335-840b-490c-a36f-0cdef337ab79-system-cni-dir\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj" Apr 22 19:23:02.157046 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-os-release\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.157046 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.156863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-lib-modules\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.157046 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.157000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-lib-modules\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.157270 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.157086 2576 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-tmp\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.157270 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.157227 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" Apr 22 19:23:02.157270 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.157227 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-var-lib-kubelet\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.157662 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.157491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-run-netns\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.157662 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.157587 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.157861 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.157787 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7eb8b708-4ebf-4d5f-b8a0-ee69ff963778-serviceca\") pod \"node-ca-47nkl\" (UID: \"7eb8b708-4ebf-4d5f-b8a0-ee69ff963778\") " pod="openshift-image-registry/node-ca-47nkl" Apr 22 19:23:02.157937 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.157859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-run-ovn\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.157937 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.157882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ee930469-602e-4383-9900-a97a25da678b-cni-binary-copy\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.158036 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.157937 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-cni-bin\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.158036 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.157990 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-cni-netd\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.158137 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-ovnkube-config\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.158192 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-systemd\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.158243 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158218 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-multus-conf-dir\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.158301 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158283 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-sys-fs\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" Apr 22 19:23:02.158363 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-host\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.158412 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-host-kubelet\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.158578 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-ovnkube-script-lib\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.158663 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158647 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-cnibin\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.158776 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-multus-socket-dir-parent\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.158829 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zrh6\" (UniqueName: \"kubernetes.io/projected/7eb8b708-4ebf-4d5f-b8a0-ee69ff963778-kube-api-access-8zrh6\") pod \"node-ca-47nkl\" (UID: \"7eb8b708-4ebf-4d5f-b8a0-ee69ff963778\") " pod="openshift-image-registry/node-ca-47nkl" Apr 22 19:23:02.158883 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158845 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-etc-openvswitch\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.158883 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/280aa335-840b-490c-a36f-0cdef337ab79-cnibin\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj" Apr 22 19:23:02.159002 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-var-lib-cni-bin\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.159002 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158907 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f4edf430-4780-4b0d-b495-50534d4ddccc-konnectivity-ca\") pod \"konnectivity-agent-cznmn\" (UID: \"f4edf430-4780-4b0d-b495-50534d4ddccc\") " pod="kube-system/konnectivity-agent-cznmn" Apr 22 19:23:02.159002 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-socket-dir\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" Apr 22 19:23:02.159002 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-sys\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.159002 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.158990 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/280aa335-840b-490c-a36f-0cdef337ab79-system-cni-dir\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj" Apr 22 19:23:02.159226 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.159060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-multus-socket-dir-parent\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.159276 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.159257 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-etc-openvswitch\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.159322 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.159312 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/280aa335-840b-490c-a36f-0cdef337ab79-cnibin\") pod \"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj" Apr 22 19:23:02.159384 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.159363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-host-var-lib-cni-bin\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.159478 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.159457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ee930469-602e-4383-9900-a97a25da678b-os-release\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.159527 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.159514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-run-openvswitch\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.159572 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.159551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-node-log\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.159622 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.159584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ee930469-602e-4383-9900-a97a25da678b-multus-daemon-config\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.159622 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.159615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-device-dir\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" Apr 22 19:23:02.159755 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.159648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-sysctl-d\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.160001 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.159983 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-sys\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.160067 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.160045 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-run-openvswitch\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.160127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.160092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-node-log\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.160199 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.160179 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-socket-dir\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" Apr 22 19:23:02.160652 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.160626 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ee930469-602e-4383-9900-a97a25da678b-multus-daemon-config\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.161506 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.159683 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-tuned\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.161506 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.160774 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-device-dir\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" Apr 22 19:23:02.161506 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.160867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-sysctl-d\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.161506 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.161003 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-ovn-node-metrics-cert\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.161506 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.161091 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/be8c9b4d-d8b6-438d-adfb-b1521f3c0d84-iptables-alerter-script\") pod \"iptables-alerter-qpgmr\" (UID: \"be8c9b4d-d8b6-438d-adfb-b1521f3c0d84\") " pod="openshift-network-operator/iptables-alerter-qpgmr" Apr 22 19:23:02.162570 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.162508 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4edf430-4780-4b0d-b495-50534d4ddccc-agent-certs\") pod \"konnectivity-agent-cznmn\" (UID: \"f4edf430-4780-4b0d-b495-50534d4ddccc\") " pod="kube-system/konnectivity-agent-cznmn" Apr 22 19:23:02.163243 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.163224 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-etc-tuned\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.163480 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.163457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8jsd\" (UniqueName: \"kubernetes.io/projected/fd0e1c46-4f51-455c-8267-abe0b6eacfd9-kube-api-access-f8jsd\") pod \"ovnkube-node-w57k8\" (UID: \"fd0e1c46-4f51-455c-8267-abe0b6eacfd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.163732 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.163710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9zs5q\" (UniqueName: \"kubernetes.io/projected/ee930469-602e-4383-9900-a97a25da678b-kube-api-access-9zs5q\") pod \"multus-jxhs2\" (UID: \"ee930469-602e-4383-9900-a97a25da678b\") " pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.166042 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:02.166021 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:02.166119 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:02.166047 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:02.166119 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:02.166061 2576 projected.go:194] Error preparing data for projected volume kube-api-access-x7kcl for pod openshift-network-diagnostics/network-check-target-wrvx6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:02.166194 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:02.166138 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl podName:7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb nodeName:}" failed. No retries permitted until 2026-04-22 19:23:02.666117185 +0000 UTC m=+3.088782485 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x7kcl" (UniqueName: "kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl") pod "network-check-target-wrvx6" (UID: "7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:02.169720 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.169676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hskvc\" (UniqueName: \"kubernetes.io/projected/86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6-kube-api-access-hskvc\") pod \"aws-ebs-csi-driver-node-v9k7h\" (UID: \"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" Apr 22 19:23:02.170173 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.170124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvwmj\" (UniqueName: \"kubernetes.io/projected/be8c9b4d-d8b6-438d-adfb-b1521f3c0d84-kube-api-access-zvwmj\") pod \"iptables-alerter-qpgmr\" (UID: \"be8c9b4d-d8b6-438d-adfb-b1521f3c0d84\") " pod="openshift-network-operator/iptables-alerter-qpgmr" Apr 22 19:23:02.171821 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.171785 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfvz6\" (UniqueName: \"kubernetes.io/projected/957d9773-bf39-486e-a32e-eba60e7b49e9-kube-api-access-lfvz6\") pod \"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:02.172386 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.172362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppnzc\" (UniqueName: \"kubernetes.io/projected/280aa335-840b-490c-a36f-0cdef337ab79-kube-api-access-ppnzc\") pod 
\"multus-additional-cni-plugins-2hbdj\" (UID: \"280aa335-840b-490c-a36f-0cdef337ab79\") " pod="openshift-multus/multus-additional-cni-plugins-2hbdj" Apr 22 19:23:02.172386 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.171912 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zrh6\" (UniqueName: \"kubernetes.io/projected/7eb8b708-4ebf-4d5f-b8a0-ee69ff963778-kube-api-access-8zrh6\") pod \"node-ca-47nkl\" (UID: \"7eb8b708-4ebf-4d5f-b8a0-ee69ff963778\") " pod="openshift-image-registry/node-ca-47nkl" Apr 22 19:23:02.172560 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.172364 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd77j\" (UniqueName: \"kubernetes.io/projected/6bbb5ca1-ed6e-4f68-9c57-47482245dcb1-kube-api-access-vd77j\") pod \"tuned-z4mnr\" (UID: \"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1\") " pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.340291 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.340192 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" Apr 22 19:23:02.348022 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.347998 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-47nkl" Apr 22 19:23:02.357673 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.357650 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qpgmr" Apr 22 19:23:02.364499 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.364478 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" Apr 22 19:23:02.370830 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.370815 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-cznmn" Apr 22 19:23:02.378303 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.378283 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2hbdj" Apr 22 19:23:02.383824 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.383809 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jxhs2" Apr 22 19:23:02.388567 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.388535 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" Apr 22 19:23:02.430263 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.430237 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:02.613441 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:23:02.613413 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd0e1c46_4f51_455c_8267_abe0b6eacfd9.slice/crio-2abb9fe6271683f74b6b2d070705dda87d84933769a02582e0741554c31a6d4a WatchSource:0}: Error finding container 2abb9fe6271683f74b6b2d070705dda87d84933769a02582e0741554c31a6d4a: Status 404 returned error can't find the container with id 2abb9fe6271683f74b6b2d070705dda87d84933769a02582e0741554c31a6d4a Apr 22 19:23:02.619113 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:23:02.619090 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eb8b708_4ebf_4d5f_b8a0_ee69ff963778.slice/crio-695b628c79515e55660fb9afd0c2d816735e78feace0f5802b12a7c13426f541 WatchSource:0}: Error finding container 695b628c79515e55660fb9afd0c2d816735e78feace0f5802b12a7c13426f541: Status 404 returned error can't find the container with id 
695b628c79515e55660fb9afd0c2d816735e78feace0f5802b12a7c13426f541 Apr 22 19:23:02.623560 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:23:02.623361 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe8c9b4d_d8b6_438d_adfb_b1521f3c0d84.slice/crio-9cc74bfe5a75027f7e013e0ed1d6e30571cde2b5391cf9af1eca57d6ae455a0a WatchSource:0}: Error finding container 9cc74bfe5a75027f7e013e0ed1d6e30571cde2b5391cf9af1eca57d6ae455a0a: Status 404 returned error can't find the container with id 9cc74bfe5a75027f7e013e0ed1d6e30571cde2b5391cf9af1eca57d6ae455a0a Apr 22 19:23:02.624080 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:23:02.624057 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee930469_602e_4383_9900_a97a25da678b.slice/crio-38fe41d1715d99425b1dd6584121cc83e9db16465babcd512ca12be57c5363b1 WatchSource:0}: Error finding container 38fe41d1715d99425b1dd6584121cc83e9db16465babcd512ca12be57c5363b1: Status 404 returned error can't find the container with id 38fe41d1715d99425b1dd6584121cc83e9db16465babcd512ca12be57c5363b1 Apr 22 19:23:02.624945 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:23:02.624902 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bbb5ca1_ed6e_4f68_9c57_47482245dcb1.slice/crio-de224c445d3f7fc5f27aec607d166f3ac19c9007f875ab1b8f094c2157f3531c WatchSource:0}: Error finding container de224c445d3f7fc5f27aec607d166f3ac19c9007f875ab1b8f094c2157f3531c: Status 404 returned error can't find the container with id de224c445d3f7fc5f27aec607d166f3ac19c9007f875ab1b8f094c2157f3531c Apr 22 19:23:02.626114 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:23:02.626091 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86d7910d_c0b3_43fe_96ee_d4ca45e7f6c6.slice/crio-e5a44506afd48d264ece7920e5b9a6c09b60b0166cdec44d7242f2b7fa9a2f9c WatchSource:0}: Error finding container e5a44506afd48d264ece7920e5b9a6c09b60b0166cdec44d7242f2b7fa9a2f9c: Status 404 returned error can't find the container with id e5a44506afd48d264ece7920e5b9a6c09b60b0166cdec44d7242f2b7fa9a2f9c Apr 22 19:23:02.627444 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:23:02.627400 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod280aa335_840b_490c_a36f_0cdef337ab79.slice/crio-e021d990faeccf13e8266c2299062e681e98d484f7c3edef87157d24332cef7a WatchSource:0}: Error finding container e021d990faeccf13e8266c2299062e681e98d484f7c3edef87157d24332cef7a: Status 404 returned error can't find the container with id e021d990faeccf13e8266c2299062e681e98d484f7c3edef87157d24332cef7a Apr 22 19:23:02.662801 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.662780 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs\") pod \"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:02.662883 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:02.662872 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:02.662947 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:02.662934 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs podName:957d9773-bf39-486e-a32e-eba60e7b49e9 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:03.662906037 +0000 UTC m=+4.085571321 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs") pod "network-metrics-daemon-m8fmk" (UID: "957d9773-bf39-486e-a32e-eba60e7b49e9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:02.764032 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:02.764002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7kcl\" (UniqueName: \"kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl\") pod \"network-check-target-wrvx6\" (UID: \"7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb\") " pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:02.764163 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:02.764120 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:02.764163 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:02.764134 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:02.764163 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:02.764143 2576 projected.go:194] Error preparing data for projected volume kube-api-access-x7kcl for pod openshift-network-diagnostics/network-check-target-wrvx6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:02.764295 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:02.764193 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl podName:7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:03.76417781 +0000 UTC m=+4.186843096 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-x7kcl" (UniqueName: "kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl") pod "network-check-target-wrvx6" (UID: "7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:03.148903 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:03.148784 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:01 +0000 UTC" deadline="2027-10-25 04:57:58.452413464 +0000 UTC" Apr 22 19:23:03.148903 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:03.148824 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13209h34m55.303593383s" Apr 22 19:23:03.150878 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:03.150852 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:03.151424 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:03.151025 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb" Apr 22 19:23:03.167418 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:03.166553 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal" event={"ID":"e477e0c5a4e100c99b94e03d47d9bc3f","Type":"ContainerStarted","Data":"0ae2108d12003d23a6d6507299bcd2e5b9c15ab481b5cda03e3848ab6807852b"} Apr 22 19:23:03.169101 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:03.169041 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" event={"ID":"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6","Type":"ContainerStarted","Data":"e5a44506afd48d264ece7920e5b9a6c09b60b0166cdec44d7242f2b7fa9a2f9c"} Apr 22 19:23:03.174304 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:03.174257 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" event={"ID":"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1","Type":"ContainerStarted","Data":"de224c445d3f7fc5f27aec607d166f3ac19c9007f875ab1b8f094c2157f3531c"} Apr 22 19:23:03.179676 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:03.179547 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qpgmr" event={"ID":"be8c9b4d-d8b6-438d-adfb-b1521f3c0d84","Type":"ContainerStarted","Data":"9cc74bfe5a75027f7e013e0ed1d6e30571cde2b5391cf9af1eca57d6ae455a0a"} Apr 22 19:23:03.185746 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:03.185687 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" event={"ID":"fd0e1c46-4f51-455c-8267-abe0b6eacfd9","Type":"ContainerStarted","Data":"2abb9fe6271683f74b6b2d070705dda87d84933769a02582e0741554c31a6d4a"} Apr 22 19:23:03.190170 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:03.190115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-2hbdj" event={"ID":"280aa335-840b-490c-a36f-0cdef337ab79","Type":"ContainerStarted","Data":"e021d990faeccf13e8266c2299062e681e98d484f7c3edef87157d24332cef7a"} Apr 22 19:23:03.193942 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:03.193897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jxhs2" event={"ID":"ee930469-602e-4383-9900-a97a25da678b","Type":"ContainerStarted","Data":"38fe41d1715d99425b1dd6584121cc83e9db16465babcd512ca12be57c5363b1"} Apr 22 19:23:03.197170 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:03.197127 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cznmn" event={"ID":"f4edf430-4780-4b0d-b495-50534d4ddccc","Type":"ContainerStarted","Data":"188641f111be0220f33a666d9fe0c2110b08389685711270458b4e5b03051e2d"} Apr 22 19:23:03.199988 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:03.199931 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-47nkl" event={"ID":"7eb8b708-4ebf-4d5f-b8a0-ee69ff963778","Type":"ContainerStarted","Data":"695b628c79515e55660fb9afd0c2d816735e78feace0f5802b12a7c13426f541"} Apr 22 19:23:03.672215 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:03.672178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs\") pod \"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:03.672405 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:03.672387 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:03.672486 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:03.672466 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs podName:957d9773-bf39-486e-a32e-eba60e7b49e9 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:05.672439423 +0000 UTC m=+6.095104723 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs") pod "network-metrics-daemon-m8fmk" (UID: "957d9773-bf39-486e-a32e-eba60e7b49e9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:03.773184 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:03.773132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7kcl\" (UniqueName: \"kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl\") pod \"network-check-target-wrvx6\" (UID: \"7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb\") " pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:03.773361 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:03.773326 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:03.773361 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:03.773349 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:03.773361 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:03.773363 2576 projected.go:194] Error preparing data for projected volume kube-api-access-x7kcl for pod openshift-network-diagnostics/network-check-target-wrvx6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:03.773530 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:03.773421 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl podName:7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb nodeName:}" failed. No retries permitted until 2026-04-22 19:23:05.77340214 +0000 UTC m=+6.196067431 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-x7kcl" (UniqueName: "kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl") pod "network-check-target-wrvx6" (UID: "7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:04.153946 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.153302 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:04.153946 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:04.153432 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9" Apr 22 19:23:04.211945 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.211822 2576 generic.go:358] "Generic (PLEG): container finished" podID="99b983d6e870652c0db1327874b8eda4" containerID="37e93d50be2e2ec4d82ed1e944363432b92514f2c97cc2d83df33c245164c1e1" exitCode=0 Apr 22 19:23:04.212728 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.212178 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal" event={"ID":"99b983d6e870652c0db1327874b8eda4","Type":"ContainerDied","Data":"37e93d50be2e2ec4d82ed1e944363432b92514f2c97cc2d83df33c245164c1e1"} Apr 22 19:23:04.228915 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.228864 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-15.ec2.internal" podStartSLOduration=3.228845598 podStartE2EDuration="3.228845598s" podCreationTimestamp="2026-04-22 19:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:03.182154394 +0000 UTC m=+3.604819697" watchObservedRunningTime="2026-04-22 19:23:04.228845598 +0000 UTC m=+4.651510905" Apr 22 19:23:04.664948 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.664653 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hftfk"] Apr 22 19:23:04.667771 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.666907 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hftfk" Apr 22 19:23:04.672075 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.670633 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-b9xkx\"" Apr 22 19:23:04.672075 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.670877 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:23:04.672075 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.671396 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:23:04.781179 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.781096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7gjm\" (UniqueName: \"kubernetes.io/projected/4caf5855-872c-4886-aec4-eb966cfeb4c3-kube-api-access-p7gjm\") pod \"node-resolver-hftfk\" (UID: \"4caf5855-872c-4886-aec4-eb966cfeb4c3\") " pod="openshift-dns/node-resolver-hftfk" Apr 22 19:23:04.781346 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.781179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4caf5855-872c-4886-aec4-eb966cfeb4c3-hosts-file\") pod \"node-resolver-hftfk\" (UID: \"4caf5855-872c-4886-aec4-eb966cfeb4c3\") " pod="openshift-dns/node-resolver-hftfk" Apr 22 19:23:04.781346 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.781224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4caf5855-872c-4886-aec4-eb966cfeb4c3-tmp-dir\") pod \"node-resolver-hftfk\" (UID: \"4caf5855-872c-4886-aec4-eb966cfeb4c3\") " pod="openshift-dns/node-resolver-hftfk" Apr 22 19:23:04.881814 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.881780 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7gjm\" (UniqueName: \"kubernetes.io/projected/4caf5855-872c-4886-aec4-eb966cfeb4c3-kube-api-access-p7gjm\") pod \"node-resolver-hftfk\" (UID: \"4caf5855-872c-4886-aec4-eb966cfeb4c3\") " pod="openshift-dns/node-resolver-hftfk" Apr 22 19:23:04.881990 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.881852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4caf5855-872c-4886-aec4-eb966cfeb4c3-hosts-file\") pod \"node-resolver-hftfk\" (UID: \"4caf5855-872c-4886-aec4-eb966cfeb4c3\") " pod="openshift-dns/node-resolver-hftfk" Apr 22 19:23:04.881990 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.881896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4caf5855-872c-4886-aec4-eb966cfeb4c3-tmp-dir\") pod \"node-resolver-hftfk\" (UID: \"4caf5855-872c-4886-aec4-eb966cfeb4c3\") " pod="openshift-dns/node-resolver-hftfk" Apr 22 19:23:04.882327 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.882304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4caf5855-872c-4886-aec4-eb966cfeb4c3-hosts-file\") pod \"node-resolver-hftfk\" (UID: \"4caf5855-872c-4886-aec4-eb966cfeb4c3\") " pod="openshift-dns/node-resolver-hftfk" Apr 22 19:23:04.882743 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.882720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4caf5855-872c-4886-aec4-eb966cfeb4c3-tmp-dir\") pod \"node-resolver-hftfk\" (UID: \"4caf5855-872c-4886-aec4-eb966cfeb4c3\") " pod="openshift-dns/node-resolver-hftfk" Apr 22 19:23:04.907668 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.904093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-p7gjm\" (UniqueName: \"kubernetes.io/projected/4caf5855-872c-4886-aec4-eb966cfeb4c3-kube-api-access-p7gjm\") pod \"node-resolver-hftfk\" (UID: \"4caf5855-872c-4886-aec4-eb966cfeb4c3\") " pod="openshift-dns/node-resolver-hftfk" Apr 22 19:23:04.982833 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:04.982299 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hftfk" Apr 22 19:23:05.151098 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:05.150842 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:05.151098 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:05.150978 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb" Apr 22 19:23:05.217752 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:05.217687 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal" event={"ID":"99b983d6e870652c0db1327874b8eda4","Type":"ContainerStarted","Data":"04ceab3dfe634a71ccccbae10618475d1ac42d73a251e3e555d0681bb1450278"} Apr 22 19:23:05.232144 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:05.231773 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-15.ec2.internal" podStartSLOduration=4.231753023 podStartE2EDuration="4.231753023s" podCreationTimestamp="2026-04-22 19:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:05.230931227 +0000 UTC m=+5.653596529" watchObservedRunningTime="2026-04-22 19:23:05.231753023 +0000 UTC m=+5.654418333" Apr 22 19:23:05.687987 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:05.687565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs\") pod \"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:05.687987 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:05.687727 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:05.687987 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:05.687784 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs 
podName:957d9773-bf39-486e-a32e-eba60e7b49e9 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:09.687767668 +0000 UTC m=+10.110432960 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs") pod "network-metrics-daemon-m8fmk" (UID: "957d9773-bf39-486e-a32e-eba60e7b49e9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:05.788237 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:05.788204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7kcl\" (UniqueName: \"kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl\") pod \"network-check-target-wrvx6\" (UID: \"7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb\") " pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:05.788405 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:05.788388 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:05.788485 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:05.788412 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:05.788485 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:05.788425 2576 projected.go:194] Error preparing data for projected volume kube-api-access-x7kcl for pod openshift-network-diagnostics/network-check-target-wrvx6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:05.788485 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:05.788482 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl podName:7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb nodeName:}" failed. No retries permitted until 2026-04-22 19:23:09.788464537 +0000 UTC m=+10.211129825 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-x7kcl" (UniqueName: "kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl") pod "network-check-target-wrvx6" (UID: "7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:06.153975 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:06.153842 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:06.154660 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:06.154316 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9" Apr 22 19:23:07.150765 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:07.150730 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:07.151224 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:07.150860 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb" Apr 22 19:23:08.151627 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:08.151505 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:08.152098 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:08.151647 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9" Apr 22 19:23:09.151654 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:09.151124 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:09.151654 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:09.151257 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb" Apr 22 19:23:09.725232 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:09.725144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs\") pod \"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:09.725404 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:09.725296 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:09.725404 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:09.725374 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs podName:957d9773-bf39-486e-a32e-eba60e7b49e9 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:17.725352502 +0000 UTC m=+18.148017787 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs") pod "network-metrics-daemon-m8fmk" (UID: "957d9773-bf39-486e-a32e-eba60e7b49e9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:09.825708 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:09.825668 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7kcl\" (UniqueName: \"kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl\") pod \"network-check-target-wrvx6\" (UID: \"7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb\") " pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:09.825905 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:09.825886 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:09.825905 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:09.825906 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:09.826074 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:09.825935 2576 projected.go:194] Error preparing data for projected volume kube-api-access-x7kcl for pod openshift-network-diagnostics/network-check-target-wrvx6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:09.826074 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:09.826057 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl podName:7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:17.826033771 +0000 UTC m=+18.248699069 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-x7kcl" (UniqueName: "kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl") pod "network-check-target-wrvx6" (UID: "7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:10.152806 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:10.152767 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:10.153284 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:10.152891 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9" Apr 22 19:23:11.151122 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:11.151088 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:11.151377 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:11.151214 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb" Apr 22 19:23:12.151198 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:12.151160 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:12.151651 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:12.151292 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9" Apr 22 19:23:13.151345 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:13.151313 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:13.151731 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:13.151438 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb" Apr 22 19:23:14.151751 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:14.151701 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:14.152210 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:14.151839 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9" Apr 22 19:23:15.150707 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:15.150674 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:15.150882 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:15.150774 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb" Apr 22 19:23:16.151696 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:16.151653 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:16.152174 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:16.151799 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9" Apr 22 19:23:17.151561 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:17.151519 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:17.151787 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:17.151650 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb" Apr 22 19:23:17.782142 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:17.782104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs\") pod \"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:17.782322 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:17.782261 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:17.782373 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:17.782341 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs podName:957d9773-bf39-486e-a32e-eba60e7b49e9 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:33.782321887 +0000 UTC m=+34.204987180 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs") pod "network-metrics-daemon-m8fmk" (UID: "957d9773-bf39-486e-a32e-eba60e7b49e9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:17.882892 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:17.882856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7kcl\" (UniqueName: \"kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl\") pod \"network-check-target-wrvx6\" (UID: \"7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb\") " pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:17.883066 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:17.883038 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:17.883066 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:17.883059 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:17.883143 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:17.883073 2576 projected.go:194] Error preparing data for projected volume kube-api-access-x7kcl for pod openshift-network-diagnostics/network-check-target-wrvx6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:17.883143 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:17.883127 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl podName:7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:33.883114235 +0000 UTC m=+34.305779521 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-x7kcl" (UniqueName: "kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl") pod "network-check-target-wrvx6" (UID: "7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:18.013854 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:18.013813 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-mbwdb"] Apr 22 19:23:18.031786 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:18.031763 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mbwdb" Apr 22 19:23:18.031953 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:18.031838 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mbwdb" podUID="60fd0bdf-71f1-4c96-a444-0de0f50d1c60"
Apr 22 19:23:18.085119 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:18.085017 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-dbus\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:18.085280 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:18.085134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-kubelet-config\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:18.085280 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:18.085170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:18.151546 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:18.151508 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk"
Apr 22 19:23:18.151717 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:18.151652 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9"
Apr 22 19:23:18.185940 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:18.185875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-kubelet-config\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:18.186394 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:18.185968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:18.186394 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:18.186011 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-kubelet-config\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:18.186394 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:18.186015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-dbus\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:18.186394 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:18.186122 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:18.186394 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:18.186149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-dbus\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:18.186394 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:18.186184 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret podName:60fd0bdf-71f1-4c96-a444-0de0f50d1c60 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:18.686165082 +0000 UTC m=+19.108830370 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret") pod "global-pull-secret-syncer-mbwdb" (UID: "60fd0bdf-71f1-4c96-a444-0de0f50d1c60") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:18.689726 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:18.689685 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:18.689956 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:18.689818 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:18.689956 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:18.689901 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret podName:60fd0bdf-71f1-4c96-a444-0de0f50d1c60 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:19.689880507 +0000 UTC m=+20.112545792 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret") pod "global-pull-secret-syncer-mbwdb" (UID: "60fd0bdf-71f1-4c96-a444-0de0f50d1c60") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:19.151367 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:19.151337 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6"
Apr 22 19:23:19.151499 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:19.151381 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:19.151499 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:19.151464 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb"
Apr 22 19:23:19.151606 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:19.151534 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mbwdb" podUID="60fd0bdf-71f1-4c96-a444-0de0f50d1c60"
Apr 22 19:23:19.203674 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:23:19.203646 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4caf5855_872c_4886_aec4_eb966cfeb4c3.slice/crio-ab7719acdfbaa61ffe2357d3a6e435a8e298d68157c905e497a21f9f6bbd98d9 WatchSource:0}: Error finding container ab7719acdfbaa61ffe2357d3a6e435a8e298d68157c905e497a21f9f6bbd98d9: Status 404 returned error can't find the container with id ab7719acdfbaa61ffe2357d3a6e435a8e298d68157c905e497a21f9f6bbd98d9
Apr 22 19:23:19.243305 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:19.243268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hftfk" event={"ID":"4caf5855-872c-4886-aec4-eb966cfeb4c3","Type":"ContainerStarted","Data":"ab7719acdfbaa61ffe2357d3a6e435a8e298d68157c905e497a21f9f6bbd98d9"}
Apr 22 19:23:19.698546 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:19.698294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:19.698658 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:19.698381 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:19.698710 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:19.698689 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret podName:60fd0bdf-71f1-4c96-a444-0de0f50d1c60 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:21.698667807 +0000 UTC m=+22.121333109 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret") pod "global-pull-secret-syncer-mbwdb" (UID: "60fd0bdf-71f1-4c96-a444-0de0f50d1c60") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:20.151396 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.151364 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk"
Apr 22 19:23:20.151572 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:20.151446 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9"
Apr 22 19:23:20.245653 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.245624 2576 generic.go:358] "Generic (PLEG): container finished" podID="280aa335-840b-490c-a36f-0cdef337ab79" containerID="30a7146a6fbaa9df83c888b57d64a474dad2354e75028ecc9d206c1f08648967" exitCode=0
Apr 22 19:23:20.246292 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.245690 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hbdj" event={"ID":"280aa335-840b-490c-a36f-0cdef337ab79","Type":"ContainerDied","Data":"30a7146a6fbaa9df83c888b57d64a474dad2354e75028ecc9d206c1f08648967"}
Apr 22 19:23:20.248143 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.248118 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jxhs2" event={"ID":"ee930469-602e-4383-9900-a97a25da678b","Type":"ContainerStarted","Data":"9ac5a59c647f2d4c556a3da6f55bebe4857fc16ee484ae3a617e2504a5f50089"}
Apr 22 19:23:20.249734 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.249710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cznmn" event={"ID":"f4edf430-4780-4b0d-b495-50534d4ddccc","Type":"ContainerStarted","Data":"50d83f421f36d5e74e44c3c804cdb50c1d325bb01be92ef8d74ab80b36bf34ab"}
Apr 22 19:23:20.251300 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.251274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-47nkl" event={"ID":"7eb8b708-4ebf-4d5f-b8a0-ee69ff963778","Type":"ContainerStarted","Data":"4f178b1e62c2032acbc2caafb6565bc13a90e8696f875814d2b493e136728da3"}
Apr 22 19:23:20.252589 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.252572 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hftfk" event={"ID":"4caf5855-872c-4886-aec4-eb966cfeb4c3","Type":"ContainerStarted","Data":"b1072a1781d492d36028e4868f8a9b3ed53326820b01136031098278098c0dcc"}
Apr 22 19:23:20.254563 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.254540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" event={"ID":"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6","Type":"ContainerStarted","Data":"191994d9255ec3757a0b4d3b72e43780d0028a9b4a3eeef5efaf422c16a23c77"}
Apr 22 19:23:20.255698 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.255677 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" event={"ID":"6bbb5ca1-ed6e-4f68-9c57-47482245dcb1","Type":"ContainerStarted","Data":"c5e6d1af2f13d759605de855790bc29a229f4c65382a947750c107cc05109c2d"}
Apr 22 19:23:20.258170 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.258153 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log"
Apr 22 19:23:20.258423 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.258406 2576 generic.go:358] "Generic (PLEG): container finished" podID="fd0e1c46-4f51-455c-8267-abe0b6eacfd9" containerID="36374b852bb841b0b4cfc5bd1a69bbab4819f36f6576c7f31744d50d154b3e21" exitCode=1
Apr 22 19:23:20.258490 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.258431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" event={"ID":"fd0e1c46-4f51-455c-8267-abe0b6eacfd9","Type":"ContainerStarted","Data":"efcc43e55662d5b55bf762d16345c67fb389a4baf9bfa2c52f2149a592938c51"}
Apr 22 19:23:20.258490 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.258446 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" event={"ID":"fd0e1c46-4f51-455c-8267-abe0b6eacfd9","Type":"ContainerStarted","Data":"aa386cd3f02818b5ed4cec4aa567c7261c10b1a15ade35d8c11acd568c7df589"}
Apr 22 19:23:20.258490 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.258455 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" event={"ID":"fd0e1c46-4f51-455c-8267-abe0b6eacfd9","Type":"ContainerStarted","Data":"391afbe4868f1a83eb6b79afaa2c0826281e46df5830adafe4438be30d0dab33"}
Apr 22 19:23:20.258490 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.258466 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" event={"ID":"fd0e1c46-4f51-455c-8267-abe0b6eacfd9","Type":"ContainerStarted","Data":"d1b864fe2abfa7ac13a38a53f6d88386e06284fa4702db6fffb41db7d256057a"}
Apr 22 19:23:20.258490 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.258477 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" event={"ID":"fd0e1c46-4f51-455c-8267-abe0b6eacfd9","Type":"ContainerDied","Data":"36374b852bb841b0b4cfc5bd1a69bbab4819f36f6576c7f31744d50d154b3e21"}
Apr 22 19:23:20.258490 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.258487 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" event={"ID":"fd0e1c46-4f51-455c-8267-abe0b6eacfd9","Type":"ContainerStarted","Data":"0af1b96695df6ae970ce901ce962d511f710af508e5ed31f9022ce1201b53d48"}
Apr 22 19:23:20.279161 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.279119 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-z4mnr" podStartSLOduration=3.6944641049999998 podStartE2EDuration="20.279108268s" podCreationTimestamp="2026-04-22 19:23:00 +0000 UTC" firstStartedPulling="2026-04-22 19:23:02.626841314 +0000 UTC m=+3.049506610" lastFinishedPulling="2026-04-22 19:23:19.211485477 +0000 UTC m=+19.634150773" observedRunningTime="2026-04-22 19:23:20.278830437 +0000 UTC m=+20.701495743" watchObservedRunningTime="2026-04-22 19:23:20.279108268 +0000 UTC m=+20.701773574"
Apr 22 19:23:20.291415 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.291323 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-47nkl" podStartSLOduration=8.199664151 podStartE2EDuration="20.291308103s" podCreationTimestamp="2026-04-22 19:23:00 +0000 UTC" firstStartedPulling="2026-04-22 19:23:02.621107034 +0000 UTC m=+3.043772328" lastFinishedPulling="2026-04-22 19:23:14.712750986 +0000 UTC m=+15.135416280" observedRunningTime="2026-04-22 19:23:20.291185599 +0000 UTC m=+20.713850906" watchObservedRunningTime="2026-04-22 19:23:20.291308103 +0000 UTC m=+20.713973412"
Apr 22 19:23:20.295451 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.295431 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 19:23:20.312102 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.312065 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-cznmn" podStartSLOduration=3.7995707960000003 podStartE2EDuration="20.312050669s" podCreationTimestamp="2026-04-22 19:23:00 +0000 UTC" firstStartedPulling="2026-04-22 19:23:02.623911276 +0000 UTC m=+3.046576569" lastFinishedPulling="2026-04-22 19:23:19.136391153 +0000 UTC m=+19.559056442" observedRunningTime="2026-04-22 19:23:20.311726788 +0000 UTC m=+20.734392094" watchObservedRunningTime="2026-04-22 19:23:20.312050669 +0000 UTC m=+20.734715976"
Apr 22 19:23:20.352070 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.352031 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jxhs2" podStartSLOduration=3.707103429 podStartE2EDuration="20.352018379s" podCreationTimestamp="2026-04-22 19:23:00 +0000 UTC" firstStartedPulling="2026-04-22 19:23:02.62560696 +0000 UTC m=+3.048272247" lastFinishedPulling="2026-04-22 19:23:19.270521911 +0000 UTC m=+19.693187197" observedRunningTime="2026-04-22 19:23:20.351047496 +0000 UTC m=+20.773712803" watchObservedRunningTime="2026-04-22 19:23:20.352018379 +0000 UTC m=+20.774683686"
Apr 22 19:23:20.385323 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:20.385286 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hftfk" podStartSLOduration=16.385274381 podStartE2EDuration="16.385274381s" podCreationTimestamp="2026-04-22 19:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:20.385115458 +0000 UTC m=+20.807780775" watchObservedRunningTime="2026-04-22 19:23:20.385274381 +0000 UTC m=+20.807939687"
Apr 22 19:23:21.122668 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:21.122561 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:23:20.295446641Z","UUID":"370d7002-f7dd-4688-8765-27b885ef2154","Handler":null,"Name":"","Endpoint":""}
Apr 22 19:23:21.124411 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:21.124386 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 19:23:21.124537 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:21.124418 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 19:23:21.151010 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:21.150985 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6"
Apr 22 19:23:21.151105 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:21.150985 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:21.151170 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:21.151103 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb"
Apr 22 19:23:21.151170 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:21.151162 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mbwdb" podUID="60fd0bdf-71f1-4c96-a444-0de0f50d1c60"
Apr 22 19:23:21.264860 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:21.264823 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" event={"ID":"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6","Type":"ContainerStarted","Data":"8ed59922f856a0271944be790ed4bcc95a2a8db3aa1f643e33562e08c264cbef"}
Apr 22 19:23:21.266531 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:21.266503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qpgmr" event={"ID":"be8c9b4d-d8b6-438d-adfb-b1521f3c0d84","Type":"ContainerStarted","Data":"9d313b5089fd72e29afcb6ced84ddbb3e7bee8f265d6b8b8d70a0713eee4065e"}
Apr 22 19:23:21.280439 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:21.280330 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qpgmr" podStartSLOduration=4.699849966 podStartE2EDuration="21.280313588s" podCreationTimestamp="2026-04-22 19:23:00 +0000 UTC" firstStartedPulling="2026-04-22 19:23:02.624909447 +0000 UTC m=+3.047574735" lastFinishedPulling="2026-04-22 19:23:19.205373067 +0000 UTC m=+19.628038357" observedRunningTime="2026-04-22 19:23:21.279712818 +0000 UTC m=+21.702378125" watchObservedRunningTime="2026-04-22 19:23:21.280313588 +0000 UTC m=+21.702978897"
Apr 22 19:23:21.715966 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:21.715915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:21.716207 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:21.716057 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:21.716207 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:21.716131 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret podName:60fd0bdf-71f1-4c96-a444-0de0f50d1c60 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:25.716116787 +0000 UTC m=+26.138782076 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret") pod "global-pull-secret-syncer-mbwdb" (UID: "60fd0bdf-71f1-4c96-a444-0de0f50d1c60") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:22.152049 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:22.151796 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk"
Apr 22 19:23:22.152249 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:22.152208 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9"
Apr 22 19:23:22.270216 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:22.270177 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" event={"ID":"86d7910d-c0b3-43fe-96ee-d4ca45e7f6c6","Type":"ContainerStarted","Data":"037f8368536e515a8503d855b8739a8540f3c2e13c94e6966ff6beb51184e01e"}
Apr 22 19:23:22.272897 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:22.272874 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log"
Apr 22 19:23:22.273291 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:22.273239 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" event={"ID":"fd0e1c46-4f51-455c-8267-abe0b6eacfd9","Type":"ContainerStarted","Data":"ded04524799459c31ff12689d21b920d96045c9b071ecb02576c7f8b28b2fdb0"}
Apr 22 19:23:22.301054 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:22.301004 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9k7h" podStartSLOduration=3.795785254 podStartE2EDuration="22.300990475s" podCreationTimestamp="2026-04-22 19:23:00 +0000 UTC" firstStartedPulling="2026-04-22 19:23:02.628263236 +0000 UTC m=+3.050928524" lastFinishedPulling="2026-04-22 19:23:21.133468457 +0000 UTC m=+21.556133745" observedRunningTime="2026-04-22 19:23:22.300185044 +0000 UTC m=+22.722850350" watchObservedRunningTime="2026-04-22 19:23:22.300990475 +0000 UTC m=+22.723655782"
Apr 22 19:23:23.150974 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:23.150914 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6"
Apr 22 19:23:23.151190 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:23.150915 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:23.151190 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:23.151063 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb"
Apr 22 19:23:23.151190 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:23.151144 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mbwdb" podUID="60fd0bdf-71f1-4c96-a444-0de0f50d1c60"
Apr 22 19:23:24.151234 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:24.151145 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk"
Apr 22 19:23:24.151848 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:24.151285 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9"
Apr 22 19:23:24.496127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:24.496095 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-cznmn"
Apr 22 19:23:24.496966 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:24.496942 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-cznmn"
Apr 22 19:23:25.150703 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.150671 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6"
Apr 22 19:23:25.150898 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.150671 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:25.150898 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:25.150768 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb"
Apr 22 19:23:25.150898 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:25.150861 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mbwdb" podUID="60fd0bdf-71f1-4c96-a444-0de0f50d1c60"
Apr 22 19:23:25.281249 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.281219 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log"
Apr 22 19:23:25.281881 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.281522 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" event={"ID":"fd0e1c46-4f51-455c-8267-abe0b6eacfd9","Type":"ContainerStarted","Data":"026ee8670d35295980d28e68810e9ebacf0ac28b44b488fa84efc66e495fb381"}
Apr 22 19:23:25.281881 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.281837 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:25.282042 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.281939 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:25.282042 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.281956 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:25.282135 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.282100 2576 scope.go:117] "RemoveContainer" containerID="36374b852bb841b0b4cfc5bd1a69bbab4819f36f6576c7f31744d50d154b3e21"
Apr 22 19:23:25.283282 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.283259 2576 generic.go:358] "Generic (PLEG): container finished" podID="280aa335-840b-490c-a36f-0cdef337ab79" containerID="a08ae5cb5b863033e9bda336bdb9bd36b94490a80305261bc5f6e6d2d92e84d5" exitCode=0
Apr 22 19:23:25.283381 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.283342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hbdj" event={"ID":"280aa335-840b-490c-a36f-0cdef337ab79","Type":"ContainerDied","Data":"a08ae5cb5b863033e9bda336bdb9bd36b94490a80305261bc5f6e6d2d92e84d5"}
Apr 22 19:23:25.283527 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.283501 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-cznmn"
Apr 22 19:23:25.284056 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.284036 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-cznmn"
Apr 22 19:23:25.297411 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.297390 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:25.297582 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.297569 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:23:25.744137 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:25.744098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:25.744357 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:25.744306 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:25.744435 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:25.744388 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret podName:60fd0bdf-71f1-4c96-a444-0de0f50d1c60 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:33.744365697 +0000 UTC m=+34.167030999 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret") pod "global-pull-secret-syncer-mbwdb" (UID: "60fd0bdf-71f1-4c96-a444-0de0f50d1c60") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:26.151308 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:26.151285 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk"
Apr 22 19:23:26.151409 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:26.151390 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9"
Apr 22 19:23:26.286519 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:26.286421 2576 generic.go:358] "Generic (PLEG): container finished" podID="280aa335-840b-490c-a36f-0cdef337ab79" containerID="85d4a88cf335fe584ae16529bc0f0900a2b9c997da2fa18b625435e71511e1a1" exitCode=0
Apr 22 19:23:26.286519 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:26.286464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hbdj" event={"ID":"280aa335-840b-490c-a36f-0cdef337ab79","Type":"ContainerDied","Data":"85d4a88cf335fe584ae16529bc0f0900a2b9c997da2fa18b625435e71511e1a1"}
Apr 22 19:23:26.290105 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:26.290086 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log"
Apr 22 19:23:26.290392 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:26.290371 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" event={"ID":"fd0e1c46-4f51-455c-8267-abe0b6eacfd9","Type":"ContainerStarted","Data":"db77b6a96dab6d2413494a6f03f50449ffd8693bdc4f4cb29312be9b5556a9fe"}
Apr 22 19:23:26.343629 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:26.342476 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8" podStartSLOduration=9.684223702 podStartE2EDuration="26.342457211s" podCreationTimestamp="2026-04-22 19:23:00 +0000 UTC" firstStartedPulling="2026-04-22 19:23:02.616292036 +0000 UTC m=+3.038957332" lastFinishedPulling="2026-04-22 19:23:19.274525551 +0000 UTC m=+19.697190841" observedRunningTime="2026-04-22 19:23:26.341445787 +0000 UTC m=+26.764111096" watchObservedRunningTime="2026-04-22 19:23:26.342457211 +0000 UTC m=+26.765122518"
Apr 22 19:23:26.461448 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:26.461419 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-m8fmk"]
Apr 22 19:23:26.461611 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:26.461530 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk"
Apr 22 19:23:26.461660 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:26.461625 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9"
Apr 22 19:23:26.463187 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:26.463162 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mbwdb"]
Apr 22 19:23:26.463294 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:26.463276 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:26.463391 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:26.463371 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mbwdb" podUID="60fd0bdf-71f1-4c96-a444-0de0f50d1c60"
Apr 22 19:23:26.485885 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:26.485839 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wrvx6"]
Apr 22 19:23:26.486037 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:26.485971 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6"
Apr 22 19:23:26.486103 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:26.486050 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb" Apr 22 19:23:27.294425 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:27.294345 2576 generic.go:358] "Generic (PLEG): container finished" podID="280aa335-840b-490c-a36f-0cdef337ab79" containerID="0a5a941cf77d57366e33a290e688d92a3493e98e076711b056ec6809418042d0" exitCode=0 Apr 22 19:23:27.294800 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:27.294411 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hbdj" event={"ID":"280aa335-840b-490c-a36f-0cdef337ab79","Type":"ContainerDied","Data":"0a5a941cf77d57366e33a290e688d92a3493e98e076711b056ec6809418042d0"} Apr 22 19:23:28.150771 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:28.150735 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mbwdb" Apr 22 19:23:28.150956 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:28.150735 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:28.150956 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:28.150883 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mbwdb" podUID="60fd0bdf-71f1-4c96-a444-0de0f50d1c60" Apr 22 19:23:28.150956 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:28.150735 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:28.151098 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:28.150958 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb" Apr 22 19:23:28.151098 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:28.151015 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9" Apr 22 19:23:30.152276 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:30.152042 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:30.152839 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:30.152107 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:30.152839 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:30.152384 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9" Apr 22 19:23:30.152839 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:30.152136 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mbwdb" Apr 22 19:23:30.152839 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:30.152513 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb" Apr 22 19:23:30.152839 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:30.152632 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mbwdb" podUID="60fd0bdf-71f1-4c96-a444-0de0f50d1c60" Apr 22 19:23:32.151820 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.151734 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:32.152403 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.151743 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mbwdb" Apr 22 19:23:32.152403 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:32.151865 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9" Apr 22 19:23:32.152403 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.151884 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:32.152403 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:32.151999 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mbwdb" podUID="60fd0bdf-71f1-4c96-a444-0de0f50d1c60" Apr 22 19:23:32.152403 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:32.152088 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wrvx6" podUID="7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb" Apr 22 19:23:32.417631 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.417563 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-15.ec2.internal" event="NodeReady" Apr 22 19:23:32.417763 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.417680 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:23:32.455624 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.455572 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d578bf96b-fdww5"] Apr 22 19:23:32.487540 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.487438 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9f56t"] Apr 22 19:23:32.487701 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.487615 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.491135 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.491111 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 19:23:32.491515 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.491371 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 19:23:32.491515 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.491397 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 19:23:32.492111 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.492088 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2jtv7\"" Apr 22 19:23:32.498143 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.497713 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 19:23:32.507736 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.507712 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9f56t"] Apr 22 19:23:32.507873 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.507753 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-n2sn2"] Apr 22 19:23:32.507873 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.507761 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" Apr 22 19:23:32.512454 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.512435 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-5ns48\"" Apr 22 19:23:32.512629 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.512611 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 19:23:32.512712 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.512660 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 19:23:32.520413 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.520392 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d578bf96b-fdww5"] Apr 22 19:23:32.520413 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.520416 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vk4gw"] Apr 22 19:23:32.520592 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.520565 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-n2sn2" Apr 22 19:23:32.523439 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.523376 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:23:32.523874 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.523856 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:23:32.523974 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.523884 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f8zlb\"" Apr 22 19:23:32.535191 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.535171 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n2sn2"] Apr 22 19:23:32.535296 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.535195 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vk4gw"] Apr 22 19:23:32.535346 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.535298 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vk4gw" Apr 22 19:23:32.538318 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.538296 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:23:32.538875 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.538853 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l8fgp\"" Apr 22 19:23:32.542625 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.539699 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:23:32.542625 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.539802 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:23:32.604162 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.604128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/130af29b-4cd5-410b-b95b-ed57b79c76d2-installation-pull-secrets\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.604408 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.604182 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.604408 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.604240 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" Apr 22 19:23:32.604408 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.604285 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-bound-sa-token\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.604408 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.604308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/130af29b-4cd5-410b-b95b-ed57b79c76d2-trusted-ca\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.604408 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.604339 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/130af29b-4cd5-410b-b95b-ed57b79c76d2-image-registry-private-configuration\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.604408 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.604363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-certificates\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.604408 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.604402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/130af29b-4cd5-410b-b95b-ed57b79c76d2-ca-trust-extracted\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.604651 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.604423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49bcb\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-kube-api-access-49bcb\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.604651 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.604437 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/18fa20f4-e79f-4f01-9142-38e98b2350d6-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" Apr 22 19:23:32.705538 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.705446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49bcb\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-kube-api-access-49bcb\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " 
pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.705538 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.705489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/18fa20f4-e79f-4f01-9142-38e98b2350d6-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" Apr 22 19:23:32.705761 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.705598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/130af29b-4cd5-410b-b95b-ed57b79c76d2-installation-pull-secrets\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.705761 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.705644 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0674059-fa3f-4411-bea0-5b58dca69acc-config-volume\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2" Apr 22 19:23:32.705761 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.705691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.705761 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.705721 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" Apr 22 19:23:32.705995 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.705784 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert\") pod \"ingress-canary-vk4gw\" (UID: \"0094a2f2-1687-4429-a919-c7f5d7498255\") " pod="openshift-ingress-canary/ingress-canary-vk4gw" Apr 22 19:23:32.705995 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.705852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-bound-sa-token\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.705995 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.705879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d0674059-fa3f-4411-bea0-5b58dca69acc-tmp-dir\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2" Apr 22 19:23:32.705995 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:32.705893 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:23:32.705995 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.705938 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/130af29b-4cd5-410b-b95b-ed57b79c76d2-trusted-ca\") pod 
\"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.705995 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:32.705956 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:23:32.705995 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.705970 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2" Apr 22 19:23:32.705995 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:32.705976 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d578bf96b-fdww5: secret "image-registry-tls" not found Apr 22 19:23:32.706309 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:32.706001 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert podName:18fa20f4-e79f-4f01-9142-38e98b2350d6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:33.205979915 +0000 UTC m=+33.628645213 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9f56t" (UID: "18fa20f4-e79f-4f01-9142-38e98b2350d6") : secret "networking-console-plugin-cert" not found Apr 22 19:23:32.706309 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:32.706048 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls podName:130af29b-4cd5-410b-b95b-ed57b79c76d2 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:33.206038444 +0000 UTC m=+33.628703734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls") pod "image-registry-5d578bf96b-fdww5" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2") : secret "image-registry-tls" not found Apr 22 19:23:32.706309 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.706067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsh6c\" (UniqueName: \"kubernetes.io/projected/d0674059-fa3f-4411-bea0-5b58dca69acc-kube-api-access-vsh6c\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2" Apr 22 19:23:32.706309 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.706091 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvmkv\" (UniqueName: \"kubernetes.io/projected/0094a2f2-1687-4429-a919-c7f5d7498255-kube-api-access-kvmkv\") pod \"ingress-canary-vk4gw\" (UID: \"0094a2f2-1687-4429-a919-c7f5d7498255\") " pod="openshift-ingress-canary/ingress-canary-vk4gw" Apr 22 19:23:32.706309 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.706141 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/130af29b-4cd5-410b-b95b-ed57b79c76d2-image-registry-private-configuration\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.706309 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.706167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-certificates\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.706309 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.706193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/130af29b-4cd5-410b-b95b-ed57b79c76d2-ca-trust-extracted\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.706671 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.706439 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/18fa20f4-e79f-4f01-9142-38e98b2350d6-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" Apr 22 19:23:32.706671 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.706595 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/130af29b-4cd5-410b-b95b-ed57b79c76d2-ca-trust-extracted\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") 
" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.706773 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.706752 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-certificates\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.706827 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.706805 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/130af29b-4cd5-410b-b95b-ed57b79c76d2-trusted-ca\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.709845 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.709824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/130af29b-4cd5-410b-b95b-ed57b79c76d2-installation-pull-secrets\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.709845 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.709835 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/130af29b-4cd5-410b-b95b-ed57b79c76d2-image-registry-private-configuration\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.714652 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.714631 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49bcb\" 
(UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-kube-api-access-49bcb\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.714807 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.714787 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-bound-sa-token\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:32.806854 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.806819 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0674059-fa3f-4411-bea0-5b58dca69acc-config-volume\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2" Apr 22 19:23:32.807075 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.807026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert\") pod \"ingress-canary-vk4gw\" (UID: \"0094a2f2-1687-4429-a919-c7f5d7498255\") " pod="openshift-ingress-canary/ingress-canary-vk4gw" Apr 22 19:23:32.807140 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.807077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d0674059-fa3f-4411-bea0-5b58dca69acc-tmp-dir\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2" Apr 22 19:23:32.807140 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.807118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2" Apr 22 19:23:32.807239 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.807141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsh6c\" (UniqueName: \"kubernetes.io/projected/d0674059-fa3f-4411-bea0-5b58dca69acc-kube-api-access-vsh6c\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2" Apr 22 19:23:32.807239 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:32.807159 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:32.807403 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:32.807241 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert podName:0094a2f2-1687-4429-a919-c7f5d7498255 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:33.307220275 +0000 UTC m=+33.729885561 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert") pod "ingress-canary-vk4gw" (UID: "0094a2f2-1687-4429-a919-c7f5d7498255") : secret "canary-serving-cert" not found Apr 22 19:23:32.807403 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.807163 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvmkv\" (UniqueName: \"kubernetes.io/projected/0094a2f2-1687-4429-a919-c7f5d7498255-kube-api-access-kvmkv\") pod \"ingress-canary-vk4gw\" (UID: \"0094a2f2-1687-4429-a919-c7f5d7498255\") " pod="openshift-ingress-canary/ingress-canary-vk4gw" Apr 22 19:23:32.807403 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:32.807361 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:32.807565 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:32.807413 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls podName:d0674059-fa3f-4411-bea0-5b58dca69acc nodeName:}" failed. No retries permitted until 2026-04-22 19:23:33.307395248 +0000 UTC m=+33.730060548 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls") pod "dns-default-n2sn2" (UID: "d0674059-fa3f-4411-bea0-5b58dca69acc") : secret "dns-default-metrics-tls" not found Apr 22 19:23:32.807565 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.807414 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d0674059-fa3f-4411-bea0-5b58dca69acc-tmp-dir\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2" Apr 22 19:23:32.811841 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.811818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0674059-fa3f-4411-bea0-5b58dca69acc-config-volume\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2" Apr 22 19:23:32.817736 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.817701 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvmkv\" (UniqueName: \"kubernetes.io/projected/0094a2f2-1687-4429-a919-c7f5d7498255-kube-api-access-kvmkv\") pod \"ingress-canary-vk4gw\" (UID: \"0094a2f2-1687-4429-a919-c7f5d7498255\") " pod="openshift-ingress-canary/ingress-canary-vk4gw" Apr 22 19:23:32.818194 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:32.818173 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsh6c\" (UniqueName: \"kubernetes.io/projected/d0674059-fa3f-4411-bea0-5b58dca69acc-kube-api-access-vsh6c\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2" Apr 22 19:23:33.210505 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:33.210468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:33.210505 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:33.210508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" Apr 22 19:23:33.211223 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.210624 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:23:33.211223 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.210647 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d578bf96b-fdww5: secret "image-registry-tls" not found Apr 22 19:23:33.211223 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.210709 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls podName:130af29b-4cd5-410b-b95b-ed57b79c76d2 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:34.210692239 +0000 UTC m=+34.633357552 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls") pod "image-registry-5d578bf96b-fdww5" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2") : secret "image-registry-tls" not found Apr 22 19:23:33.211223 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.210720 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:23:33.211223 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.210770 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert podName:18fa20f4-e79f-4f01-9142-38e98b2350d6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:34.210758795 +0000 UTC m=+34.633424086 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9f56t" (UID: "18fa20f4-e79f-4f01-9142-38e98b2350d6") : secret "networking-console-plugin-cert" not found Apr 22 19:23:33.311554 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:33.311519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert\") pod \"ingress-canary-vk4gw\" (UID: \"0094a2f2-1687-4429-a919-c7f5d7498255\") " pod="openshift-ingress-canary/ingress-canary-vk4gw" Apr 22 19:23:33.311737 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:33.311569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2" 
Apr 22 19:23:33.311737 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.311658 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:33.311737 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.311697 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:33.311737 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.311723 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert podName:0094a2f2-1687-4429-a919-c7f5d7498255 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:34.31170825 +0000 UTC m=+34.734373535 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert") pod "ingress-canary-vk4gw" (UID: "0094a2f2-1687-4429-a919-c7f5d7498255") : secret "canary-serving-cert" not found Apr 22 19:23:33.311905 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.311744 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls podName:d0674059-fa3f-4411-bea0-5b58dca69acc nodeName:}" failed. No retries permitted until 2026-04-22 19:23:34.311729843 +0000 UTC m=+34.734395142 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls") pod "dns-default-n2sn2" (UID: "d0674059-fa3f-4411-bea0-5b58dca69acc") : secret "dns-default-metrics-tls" not found Apr 22 19:23:33.815675 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:33.815632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs\") pod \"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:33.815860 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:33.815766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb" Apr 22 19:23:33.815860 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.815804 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:33.816005 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.815870 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:33.816005 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.815884 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs podName:957d9773-bf39-486e-a32e-eba60e7b49e9 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:05.815863579 +0000 UTC m=+66.238528869 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs") pod "network-metrics-daemon-m8fmk" (UID: "957d9773-bf39-486e-a32e-eba60e7b49e9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:33.816005 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.815947 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret podName:60fd0bdf-71f1-4c96-a444-0de0f50d1c60 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:49.815915822 +0000 UTC m=+50.238581108 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret") pod "global-pull-secret-syncer-mbwdb" (UID: "60fd0bdf-71f1-4c96-a444-0de0f50d1c60") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:33.916438 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:33.916398 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7kcl\" (UniqueName: \"kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl\") pod \"network-check-target-wrvx6\" (UID: \"7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb\") " pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:33.916633 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.916560 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:33.916633 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.916576 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:33.916633 ip-10-0-138-15 
kubenswrapper[2576]: E0422 19:23:33.916585 2576 projected.go:194] Error preparing data for projected volume kube-api-access-x7kcl for pod openshift-network-diagnostics/network-check-target-wrvx6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:33.916798 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:33.916647 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl podName:7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb nodeName:}" failed. No retries permitted until 2026-04-22 19:24:05.916630361 +0000 UTC m=+66.339295649 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-x7kcl" (UniqueName: "kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl") pod "network-check-target-wrvx6" (UID: "7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:34.151727 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:34.151693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:23:34.151892 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:34.151693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:23:34.151892 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:34.151693 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mbwdb" Apr 22 19:23:34.155023 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:34.155001 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:23:34.156214 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:34.156191 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:23:34.156344 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:34.156245 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:23:34.156411 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:34.156341 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-l6zl6\"" Apr 22 19:23:34.156411 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:34.156345 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:23:34.156508 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:34.156486 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kmhxv\"" Apr 22 19:23:34.219165 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:34.219118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:34.219610 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:34.219179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" Apr 22 19:23:34.219610 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:34.219293 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:23:34.219610 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:34.219318 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d578bf96b-fdww5: secret "image-registry-tls" not found Apr 22 19:23:34.219610 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:34.219349 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:23:34.219610 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:34.219408 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls podName:130af29b-4cd5-410b-b95b-ed57b79c76d2 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:36.219386486 +0000 UTC m=+36.642051788 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls") pod "image-registry-5d578bf96b-fdww5" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2") : secret "image-registry-tls" not found Apr 22 19:23:34.219610 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:34.219427 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert podName:18fa20f4-e79f-4f01-9142-38e98b2350d6 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:36.219417644 +0000 UTC m=+36.642082934 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9f56t" (UID: "18fa20f4-e79f-4f01-9142-38e98b2350d6") : secret "networking-console-plugin-cert" not found Apr 22 19:23:34.319688 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:34.319653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2" Apr 22 19:23:34.319892 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:34.319788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert\") pod \"ingress-canary-vk4gw\" (UID: \"0094a2f2-1687-4429-a919-c7f5d7498255\") " pod="openshift-ingress-canary/ingress-canary-vk4gw" Apr 22 19:23:34.319892 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:34.319849 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:34.320031 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:34.319892 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:34.320031 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:34.319939 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls podName:d0674059-fa3f-4411-bea0-5b58dca69acc nodeName:}" failed. No retries permitted until 2026-04-22 19:23:36.319898744 +0000 UTC m=+36.742564043 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls") pod "dns-default-n2sn2" (UID: "d0674059-fa3f-4411-bea0-5b58dca69acc") : secret "dns-default-metrics-tls" not found Apr 22 19:23:34.320031 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:34.319964 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert podName:0094a2f2-1687-4429-a919-c7f5d7498255 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:36.319949285 +0000 UTC m=+36.742614809 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert") pod "ingress-canary-vk4gw" (UID: "0094a2f2-1687-4429-a919-c7f5d7498255") : secret "canary-serving-cert" not found Apr 22 19:23:36.236119 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:36.235887 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:23:36.236479 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:36.236141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" Apr 22 19:23:36.236479 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:36.236034 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 
19:23:36.236479 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:36.236199 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d578bf96b-fdww5: secret "image-registry-tls" not found Apr 22 19:23:36.236479 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:36.236262 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls podName:130af29b-4cd5-410b-b95b-ed57b79c76d2 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:40.236243807 +0000 UTC m=+40.658909097 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls") pod "image-registry-5d578bf96b-fdww5" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2") : secret "image-registry-tls" not found Apr 22 19:23:36.236479 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:36.236304 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:23:36.236479 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:36.236345 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert podName:18fa20f4-e79f-4f01-9142-38e98b2350d6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:40.236332397 +0000 UTC m=+40.658997682 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9f56t" (UID: "18fa20f4-e79f-4f01-9142-38e98b2350d6") : secret "networking-console-plugin-cert" not found Apr 22 19:23:36.317309 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:36.317276 2576 generic.go:358] "Generic (PLEG): container finished" podID="280aa335-840b-490c-a36f-0cdef337ab79" containerID="f093a59a71b4de0497a89ccf5d1ab61b9825c2c497877f4e0d06e9f74f2567db" exitCode=0 Apr 22 19:23:36.317452 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:36.317346 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hbdj" event={"ID":"280aa335-840b-490c-a36f-0cdef337ab79","Type":"ContainerDied","Data":"f093a59a71b4de0497a89ccf5d1ab61b9825c2c497877f4e0d06e9f74f2567db"} Apr 22 19:23:36.337036 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:36.337002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert\") pod \"ingress-canary-vk4gw\" (UID: \"0094a2f2-1687-4429-a919-c7f5d7498255\") " pod="openshift-ingress-canary/ingress-canary-vk4gw" Apr 22 19:23:36.337152 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:36.337067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2" Apr 22 19:23:36.337217 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:36.337206 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:36.337266 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:36.337206 2576 
secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:36.337266 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:36.337261 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls podName:d0674059-fa3f-4411-bea0-5b58dca69acc nodeName:}" failed. No retries permitted until 2026-04-22 19:23:40.337243367 +0000 UTC m=+40.759908655 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls") pod "dns-default-n2sn2" (UID: "d0674059-fa3f-4411-bea0-5b58dca69acc") : secret "dns-default-metrics-tls" not found Apr 22 19:23:36.337368 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:36.337290 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert podName:0094a2f2-1687-4429-a919-c7f5d7498255 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:40.337275903 +0000 UTC m=+40.759941187 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert") pod "ingress-canary-vk4gw" (UID: "0094a2f2-1687-4429-a919-c7f5d7498255") : secret "canary-serving-cert" not found
Apr 22 19:23:37.321674 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:37.321638 2576 generic.go:358] "Generic (PLEG): container finished" podID="280aa335-840b-490c-a36f-0cdef337ab79" containerID="80bd325710d805a13f5b57678847b285c22bd07869d14592881a81157a4e6449" exitCode=0
Apr 22 19:23:37.321674 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:37.321678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hbdj" event={"ID":"280aa335-840b-490c-a36f-0cdef337ab79","Type":"ContainerDied","Data":"80bd325710d805a13f5b57678847b285c22bd07869d14592881a81157a4e6449"}
Apr 22 19:23:38.326471 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:38.326438 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hbdj" event={"ID":"280aa335-840b-490c-a36f-0cdef337ab79","Type":"ContainerStarted","Data":"d47266b02fe2eb846b288e6f5bed8ec5481b59c49ed5a4da060ef2e22dd3a027"}
Apr 22 19:23:38.351856 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:38.351801 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2hbdj" podStartSLOduration=5.161929776 podStartE2EDuration="38.351783982s" podCreationTimestamp="2026-04-22 19:23:00 +0000 UTC" firstStartedPulling="2026-04-22 19:23:02.629557143 +0000 UTC m=+3.052222436" lastFinishedPulling="2026-04-22 19:23:35.819411349 +0000 UTC m=+36.242076642" observedRunningTime="2026-04-22 19:23:38.350759353 +0000 UTC m=+38.773424673" watchObservedRunningTime="2026-04-22 19:23:38.351783982 +0000 UTC m=+38.774449290"
Apr 22 19:23:40.267359 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:40.267320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5"
Apr 22 19:23:40.267359 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:40.267362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t"
Apr 22 19:23:40.267818 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:40.267478 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:23:40.267818 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:40.267485 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:23:40.267818 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:40.267507 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d578bf96b-fdww5: secret "image-registry-tls" not found
Apr 22 19:23:40.267818 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:40.267525 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert podName:18fa20f4-e79f-4f01-9142-38e98b2350d6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:48.267513294 +0000 UTC m=+48.690178579 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9f56t" (UID: "18fa20f4-e79f-4f01-9142-38e98b2350d6") : secret "networking-console-plugin-cert" not found
Apr 22 19:23:40.267818 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:40.267560 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls podName:130af29b-4cd5-410b-b95b-ed57b79c76d2 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:48.267543792 +0000 UTC m=+48.690209081 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls") pod "image-registry-5d578bf96b-fdww5" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2") : secret "image-registry-tls" not found
Apr 22 19:23:40.368260 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:40.368230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert\") pod \"ingress-canary-vk4gw\" (UID: \"0094a2f2-1687-4429-a919-c7f5d7498255\") " pod="openshift-ingress-canary/ingress-canary-vk4gw"
Apr 22 19:23:40.368433 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:40.368268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2"
Apr 22 19:23:40.368433 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:40.368355 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:23:40.368433 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:40.368374 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:23:40.368433 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:40.368400 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls podName:d0674059-fa3f-4411-bea0-5b58dca69acc nodeName:}" failed. No retries permitted until 2026-04-22 19:23:48.368388051 +0000 UTC m=+48.791053336 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls") pod "dns-default-n2sn2" (UID: "d0674059-fa3f-4411-bea0-5b58dca69acc") : secret "dns-default-metrics-tls" not found
Apr 22 19:23:40.368433 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:40.368427 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert podName:0094a2f2-1687-4429-a919-c7f5d7498255 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:48.368412125 +0000 UTC m=+48.791077416 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert") pod "ingress-canary-vk4gw" (UID: "0094a2f2-1687-4429-a919-c7f5d7498255") : secret "canary-serving-cert" not found
Apr 22 19:23:48.328171 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:48.328126 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5"
Apr 22 19:23:48.328614 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:48.328176 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t"
Apr 22 19:23:48.328614 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:48.328272 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:23:48.328614 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:48.328293 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d578bf96b-fdww5: secret "image-registry-tls" not found
Apr 22 19:23:48.328614 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:48.328321 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:23:48.328614 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:48.328347 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls podName:130af29b-4cd5-410b-b95b-ed57b79c76d2 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:04.328331298 +0000 UTC m=+64.750996589 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls") pod "image-registry-5d578bf96b-fdww5" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2") : secret "image-registry-tls" not found
Apr 22 19:23:48.328614 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:48.328379 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert podName:18fa20f4-e79f-4f01-9142-38e98b2350d6 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:04.32836361 +0000 UTC m=+64.751028895 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9f56t" (UID: "18fa20f4-e79f-4f01-9142-38e98b2350d6") : secret "networking-console-plugin-cert" not found
Apr 22 19:23:48.429001 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:48.428960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert\") pod \"ingress-canary-vk4gw\" (UID: \"0094a2f2-1687-4429-a919-c7f5d7498255\") " pod="openshift-ingress-canary/ingress-canary-vk4gw"
Apr 22 19:23:48.429001 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:48.429009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2"
Apr 22 19:23:48.429258 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:48.429100 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:23:48.429258 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:48.429112 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:23:48.429258 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:48.429161 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert podName:0094a2f2-1687-4429-a919-c7f5d7498255 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:04.429146253 +0000 UTC m=+64.851811537 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert") pod "ingress-canary-vk4gw" (UID: "0094a2f2-1687-4429-a919-c7f5d7498255") : secret "canary-serving-cert" not found
Apr 22 19:23:48.429258 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:23:48.429177 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls podName:d0674059-fa3f-4411-bea0-5b58dca69acc nodeName:}" failed. No retries permitted until 2026-04-22 19:24:04.429170905 +0000 UTC m=+64.851836189 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls") pod "dns-default-n2sn2" (UID: "d0674059-fa3f-4411-bea0-5b58dca69acc") : secret "dns-default-metrics-tls" not found
Apr 22 19:23:49.841781 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:49.841719 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:49.844799 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:49.844776 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/60fd0bdf-71f1-4c96-a444-0de0f50d1c60-original-pull-secret\") pod \"global-pull-secret-syncer-mbwdb\" (UID: \"60fd0bdf-71f1-4c96-a444-0de0f50d1c60\") " pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:50.075017 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:50.074981 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mbwdb"
Apr 22 19:23:50.233224 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:50.233188 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mbwdb"]
Apr 22 19:23:50.348964 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:50.348899 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mbwdb" event={"ID":"60fd0bdf-71f1-4c96-a444-0de0f50d1c60","Type":"ContainerStarted","Data":"c9480771ea2a5c1664726800e44eff04cd70a60668f536d22f054d133b2e9b2d"}
Apr 22 19:23:54.359017 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:54.358982 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mbwdb" event={"ID":"60fd0bdf-71f1-4c96-a444-0de0f50d1c60","Type":"ContainerStarted","Data":"91ae18af2b40c2b81b58cf809f323eaafb09318cfae690a2cb407f8b1ca0b7ff"}
Apr 22 19:23:54.374321 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:54.374275 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-mbwdb" podStartSLOduration=33.803324124 podStartE2EDuration="37.374260482s" podCreationTimestamp="2026-04-22 19:23:17 +0000 UTC" firstStartedPulling="2026-04-22 19:23:50.23806019 +0000 UTC m=+50.660725478" lastFinishedPulling="2026-04-22 19:23:53.808996551 +0000 UTC m=+54.231661836" observedRunningTime="2026-04-22 19:23:54.373956997 +0000 UTC m=+54.796622304" watchObservedRunningTime="2026-04-22 19:23:54.374260482 +0000 UTC m=+54.796925789"
Apr 22 19:23:57.312605 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:23:57.312578 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w57k8"
Apr 22 19:24:04.354558 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:04.354514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5"
Apr 22 19:24:04.354558 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:04.354562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t"
Apr 22 19:24:04.355067 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:04.354671 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:24:04.355067 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:04.354696 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:24:04.355067 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:04.354717 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d578bf96b-fdww5: secret "image-registry-tls" not found
Apr 22 19:24:04.355067 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:04.354732 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert podName:18fa20f4-e79f-4f01-9142-38e98b2350d6 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:36.354718336 +0000 UTC m=+96.777383621 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9f56t" (UID: "18fa20f4-e79f-4f01-9142-38e98b2350d6") : secret "networking-console-plugin-cert" not found
Apr 22 19:24:04.355067 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:04.354769 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls podName:130af29b-4cd5-410b-b95b-ed57b79c76d2 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:36.354755545 +0000 UTC m=+96.777420830 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls") pod "image-registry-5d578bf96b-fdww5" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2") : secret "image-registry-tls" not found
Apr 22 19:24:04.455310 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:04.455271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert\") pod \"ingress-canary-vk4gw\" (UID: \"0094a2f2-1687-4429-a919-c7f5d7498255\") " pod="openshift-ingress-canary/ingress-canary-vk4gw"
Apr 22 19:24:04.455483 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:04.455322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2"
Apr 22 19:24:04.455483 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:04.455425 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:04.455483 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:04.455434 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:04.455598 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:04.455491 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert podName:0094a2f2-1687-4429-a919-c7f5d7498255 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:36.455473825 +0000 UTC m=+96.878139121 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert") pod "ingress-canary-vk4gw" (UID: "0094a2f2-1687-4429-a919-c7f5d7498255") : secret "canary-serving-cert" not found
Apr 22 19:24:04.455598 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:04.455506 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls podName:d0674059-fa3f-4411-bea0-5b58dca69acc nodeName:}" failed. No retries permitted until 2026-04-22 19:24:36.45550087 +0000 UTC m=+96.878166160 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls") pod "dns-default-n2sn2" (UID: "d0674059-fa3f-4411-bea0-5b58dca69acc") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:05.865226 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:05.865190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs\") pod \"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk"
Apr 22 19:24:05.868287 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:05.868270 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 19:24:05.875676 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:05.875661 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 19:24:05.875743 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:05.875732 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs podName:957d9773-bf39-486e-a32e-eba60e7b49e9 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:09.875701125 +0000 UTC m=+130.298366409 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs") pod "network-metrics-daemon-m8fmk" (UID: "957d9773-bf39-486e-a32e-eba60e7b49e9") : secret "metrics-daemon-secret" not found
Apr 22 19:24:05.966073 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:05.966042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7kcl\" (UniqueName: \"kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl\") pod \"network-check-target-wrvx6\" (UID: \"7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb\") " pod="openshift-network-diagnostics/network-check-target-wrvx6"
Apr 22 19:24:05.969139 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:05.969123 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 19:24:05.979485 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:05.979464 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 19:24:05.989364 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:05.989335 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7kcl\" (UniqueName: \"kubernetes.io/projected/7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb-kube-api-access-x7kcl\") pod \"network-check-target-wrvx6\" (UID: \"7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb\") " pod="openshift-network-diagnostics/network-check-target-wrvx6"
Apr 22 19:24:06.273255 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:06.273165 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-l6zl6\""
Apr 22 19:24:06.280735 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:06.280714 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrvx6"
Apr 22 19:24:06.390097 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:06.390070 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wrvx6"]
Apr 22 19:24:06.392950 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:24:06.392903 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ee8fcaa_f3c9_4f0e_b1c5_e6fe0fabbfeb.slice/crio-f6b11474d5b2326a41fa05960a824c9eefbe16abaa9205e79a112836b5cdde07 WatchSource:0}: Error finding container f6b11474d5b2326a41fa05960a824c9eefbe16abaa9205e79a112836b5cdde07: Status 404 returned error can't find the container with id f6b11474d5b2326a41fa05960a824c9eefbe16abaa9205e79a112836b5cdde07
Apr 22 19:24:07.388433 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:07.388383 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wrvx6" event={"ID":"7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb","Type":"ContainerStarted","Data":"f6b11474d5b2326a41fa05960a824c9eefbe16abaa9205e79a112836b5cdde07"}
Apr 22 19:24:10.394640 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:10.394604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wrvx6" event={"ID":"7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb","Type":"ContainerStarted","Data":"dbe2d1cb4541229ca5d3361bbdb70131626be60290a80b169ca33073d380b616"}
Apr 22 19:24:10.395088 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:10.394731 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-wrvx6"
Apr 22 19:24:21.848349 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:21.848191 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-wrvx6" podStartSLOduration=78.89764287 podStartE2EDuration="1m21.848158226s" podCreationTimestamp="2026-04-22 19:23:00 +0000 UTC" firstStartedPulling="2026-04-22 19:24:06.394698065 +0000 UTC m=+66.817363351" lastFinishedPulling="2026-04-22 19:24:09.345213418 +0000 UTC m=+69.767878707" observedRunningTime="2026-04-22 19:24:10.410791605 +0000 UTC m=+70.833456913" watchObservedRunningTime="2026-04-22 19:24:21.848158226 +0000 UTC m=+82.270823517"
Apr 22 19:24:21.849313 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:21.849283 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"]
Apr 22 19:24:21.852515 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:21.852492 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"
Apr 22 19:24:21.855561 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:21.855538 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 22 19:24:21.855667 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:21.855593 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 19:24:21.855667 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:21.855646 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 19:24:21.856862 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:21.856847 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 19:24:21.862096 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:21.862074 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"]
Apr 22 19:24:21.985346 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:21.985305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c7zp\" (UniqueName: \"kubernetes.io/projected/3fec23ad-5280-4ad9-9edb-3e515ed95ff6-kube-api-access-6c7zp\") pod \"klusterlet-addon-workmgr-cc884686c-n8tk5\" (UID: \"3fec23ad-5280-4ad9-9edb-3e515ed95ff6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"
Apr 22 19:24:21.985517 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:21.985378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3fec23ad-5280-4ad9-9edb-3e515ed95ff6-klusterlet-config\") pod \"klusterlet-addon-workmgr-cc884686c-n8tk5\" (UID: \"3fec23ad-5280-4ad9-9edb-3e515ed95ff6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"
Apr 22 19:24:21.985517 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:21.985397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3fec23ad-5280-4ad9-9edb-3e515ed95ff6-tmp\") pod \"klusterlet-addon-workmgr-cc884686c-n8tk5\" (UID: \"3fec23ad-5280-4ad9-9edb-3e515ed95ff6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"
Apr 22 19:24:22.086659 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:22.086616 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3fec23ad-5280-4ad9-9edb-3e515ed95ff6-klusterlet-config\") pod \"klusterlet-addon-workmgr-cc884686c-n8tk5\" (UID: \"3fec23ad-5280-4ad9-9edb-3e515ed95ff6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"
Apr 22 19:24:22.086659
ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:22.086657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3fec23ad-5280-4ad9-9edb-3e515ed95ff6-tmp\") pod \"klusterlet-addon-workmgr-cc884686c-n8tk5\" (UID: \"3fec23ad-5280-4ad9-9edb-3e515ed95ff6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"
Apr 22 19:24:22.086864 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:22.086716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6c7zp\" (UniqueName: \"kubernetes.io/projected/3fec23ad-5280-4ad9-9edb-3e515ed95ff6-kube-api-access-6c7zp\") pod \"klusterlet-addon-workmgr-cc884686c-n8tk5\" (UID: \"3fec23ad-5280-4ad9-9edb-3e515ed95ff6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"
Apr 22 19:24:22.087125 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:22.087104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3fec23ad-5280-4ad9-9edb-3e515ed95ff6-tmp\") pod \"klusterlet-addon-workmgr-cc884686c-n8tk5\" (UID: \"3fec23ad-5280-4ad9-9edb-3e515ed95ff6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"
Apr 22 19:24:22.089089 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:22.089071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3fec23ad-5280-4ad9-9edb-3e515ed95ff6-klusterlet-config\") pod \"klusterlet-addon-workmgr-cc884686c-n8tk5\" (UID: \"3fec23ad-5280-4ad9-9edb-3e515ed95ff6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"
Apr 22 19:24:22.094938 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:22.094890 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c7zp\" (UniqueName: \"kubernetes.io/projected/3fec23ad-5280-4ad9-9edb-3e515ed95ff6-kube-api-access-6c7zp\") pod \"klusterlet-addon-workmgr-cc884686c-n8tk5\" (UID: \"3fec23ad-5280-4ad9-9edb-3e515ed95ff6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"
Apr 22 19:24:22.161712 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:22.161688 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"
Apr 22 19:24:22.276898 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:22.276865 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"]
Apr 22 19:24:22.279772 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:24:22.279742 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fec23ad_5280_4ad9_9edb_3e515ed95ff6.slice/crio-b3e13cb62805ec5370a1b445bcaccc806cff1dae06ad509127bed10641546999 WatchSource:0}: Error finding container b3e13cb62805ec5370a1b445bcaccc806cff1dae06ad509127bed10641546999: Status 404 returned error can't find the container with id b3e13cb62805ec5370a1b445bcaccc806cff1dae06ad509127bed10641546999
Apr 22 19:24:22.416822 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:22.416732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5" event={"ID":"3fec23ad-5280-4ad9-9edb-3e515ed95ff6","Type":"ContainerStarted","Data":"b3e13cb62805ec5370a1b445bcaccc806cff1dae06ad509127bed10641546999"}
Apr 22 19:24:26.425291 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:26.425247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5" event={"ID":"3fec23ad-5280-4ad9-9edb-3e515ed95ff6","Type":"ContainerStarted","Data":"44c899af3cade81e46390ce8d1a9c7b5efcc51d8cfffdea07cd1f6a625999517"}
Apr 22 19:24:26.425670 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:26.425506 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"
Apr 22 19:24:26.427120 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:26.427096 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5"
Apr 22 19:24:26.443443 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:26.443404 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cc884686c-n8tk5" podStartSLOduration=2.069134636 podStartE2EDuration="5.443391736s" podCreationTimestamp="2026-04-22 19:24:21 +0000 UTC" firstStartedPulling="2026-04-22 19:24:22.281318262 +0000 UTC m=+82.703983551" lastFinishedPulling="2026-04-22 19:24:25.655575356 +0000 UTC m=+86.078240651" observedRunningTime="2026-04-22 19:24:26.442495265 +0000 UTC m=+86.865160556" watchObservedRunningTime="2026-04-22 19:24:26.443391736 +0000 UTC m=+86.866057040"
Apr 22 19:24:36.395848 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:36.395809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5"
Apr 22 19:24:36.396306 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:36.395859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t"
Apr 22 19:24:36.396306 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:36.395967 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:24:36.396306 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:36.395988 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d578bf96b-fdww5: secret "image-registry-tls" not found
Apr 22 19:24:36.396306 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:36.396011 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:24:36.396306 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:36.396056 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls podName:130af29b-4cd5-410b-b95b-ed57b79c76d2 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:40.396041492 +0000 UTC m=+160.818706776 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls") pod "image-registry-5d578bf96b-fdww5" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2") : secret "image-registry-tls" not found
Apr 22 19:24:36.396306 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:36.396070 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert podName:18fa20f4-e79f-4f01-9142-38e98b2350d6 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:40.396063902 +0000 UTC m=+160.818729186 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9f56t" (UID: "18fa20f4-e79f-4f01-9142-38e98b2350d6") : secret "networking-console-plugin-cert" not found
Apr 22 19:24:36.496582 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:36.496546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert\") pod \"ingress-canary-vk4gw\" (UID: \"0094a2f2-1687-4429-a919-c7f5d7498255\") " pod="openshift-ingress-canary/ingress-canary-vk4gw"
Apr 22 19:24:36.496777 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:36.496599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2"
Apr 22 19:24:36.496777 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:36.496698 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:36.496777 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:36.496712 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:36.496777 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:36.496772 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert podName:0094a2f2-1687-4429-a919-c7f5d7498255 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:40.496753115 +0000 UTC m=+160.919418417 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert") pod "ingress-canary-vk4gw" (UID: "0094a2f2-1687-4429-a919-c7f5d7498255") : secret "canary-serving-cert" not found Apr 22 19:24:36.496964 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:24:36.496793 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls podName:d0674059-fa3f-4411-bea0-5b58dca69acc nodeName:}" failed. No retries permitted until 2026-04-22 19:25:40.496784682 +0000 UTC m=+160.919449999 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls") pod "dns-default-n2sn2" (UID: "d0674059-fa3f-4411-bea0-5b58dca69acc") : secret "dns-default-metrics-tls" not found Apr 22 19:24:41.399818 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:24:41.399785 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-wrvx6" Apr 22 19:25:09.935828 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:09.935783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs\") pod \"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:25:09.936250 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:09.935945 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:25:09.936250 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:09.936022 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs podName:957d9773-bf39-486e-a32e-eba60e7b49e9 nodeName:}" 
failed. No retries permitted until 2026-04-22 19:27:11.936007005 +0000 UTC m=+252.358672289 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs") pod "network-metrics-daemon-m8fmk" (UID: "957d9773-bf39-486e-a32e-eba60e7b49e9") : secret "metrics-daemon-secret" not found Apr 22 19:25:19.600904 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:19.600875 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hftfk_4caf5855-872c-4886-aec4-eb966cfeb4c3/dns-node-resolver/0.log" Apr 22 19:25:20.200974 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:20.200949 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-47nkl_7eb8b708-4ebf-4d5f-b8a0-ee69ff963778/node-ca/0.log" Apr 22 19:25:23.592500 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.592466 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m"] Apr 22 19:25:23.595200 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.595181 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" Apr 22 19:25:23.597883 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.597853 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 19:25:23.597883 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.597878 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 19:25:23.598055 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.597878 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:25:23.598055 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.597946 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 19:25:23.599114 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.599098 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-j2kq2\"" Apr 22 19:25:23.603261 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.603238 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m"] Apr 22 19:25:23.633006 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.632971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c0d1dd-5d1c-443a-a71d-40d163d60028-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9776m\" (UID: \"44c0d1dd-5d1c-443a-a71d-40d163d60028\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" Apr 22 19:25:23.633150 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.633022 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv5tm\" (UniqueName: \"kubernetes.io/projected/44c0d1dd-5d1c-443a-a71d-40d163d60028-kube-api-access-jv5tm\") pod \"kube-storage-version-migrator-operator-6769c5d45-9776m\" (UID: \"44c0d1dd-5d1c-443a-a71d-40d163d60028\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" Apr 22 19:25:23.633150 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.633104 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c0d1dd-5d1c-443a-a71d-40d163d60028-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9776m\" (UID: \"44c0d1dd-5d1c-443a-a71d-40d163d60028\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" Apr 22 19:25:23.702589 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.702545 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98"] Apr 22 19:25:23.705529 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.705504 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8hr98"] Apr 22 19:25:23.705692 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.705672 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98" Apr 22 19:25:23.708373 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.708355 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" Apr 22 19:25:23.708619 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.708596 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 19:25:23.708723 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.708601 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 19:25:23.708989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.708972 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:25:23.709077 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.709003 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-g7crm\"" Apr 22 19:25:23.711481 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.711462 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 19:25:23.711577 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.711515 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 19:25:23.711769 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.711755 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-bl44j\"" Apr 22 19:25:23.711837 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.711782 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 19:25:23.712080 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.712067 2576 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:25:23.717912 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.717892 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 19:25:23.718338 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.718318 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98"] Apr 22 19:25:23.719482 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.719272 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8hr98"] Apr 22 19:25:23.733415 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.733391 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23701c89-5e40-43ee-bab7-fe2709643e97-trusted-ca\") pod \"console-operator-9d4b6777b-8hr98\" (UID: \"23701c89-5e40-43ee-bab7-fe2709643e97\") " pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" Apr 22 19:25:23.733506 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.733440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23701c89-5e40-43ee-bab7-fe2709643e97-config\") pod \"console-operator-9d4b6777b-8hr98\" (UID: \"23701c89-5e40-43ee-bab7-fe2709643e97\") " pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" Apr 22 19:25:23.733800 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.733472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tht98\" (UID: 
\"c82115e9-4eca-4c74-975e-c08cb3ed3c26\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98" Apr 22 19:25:23.734117 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.734095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpxhg\" (UniqueName: \"kubernetes.io/projected/23701c89-5e40-43ee-bab7-fe2709643e97-kube-api-access-fpxhg\") pod \"console-operator-9d4b6777b-8hr98\" (UID: \"23701c89-5e40-43ee-bab7-fe2709643e97\") " pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" Apr 22 19:25:23.734282 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.734266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c0d1dd-5d1c-443a-a71d-40d163d60028-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9776m\" (UID: \"44c0d1dd-5d1c-443a-a71d-40d163d60028\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" Apr 22 19:25:23.734413 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.734400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c0d1dd-5d1c-443a-a71d-40d163d60028-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9776m\" (UID: \"44c0d1dd-5d1c-443a-a71d-40d163d60028\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" Apr 22 19:25:23.734519 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.734505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23701c89-5e40-43ee-bab7-fe2709643e97-serving-cert\") pod \"console-operator-9d4b6777b-8hr98\" (UID: \"23701c89-5e40-43ee-bab7-fe2709643e97\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" Apr 22 19:25:23.734622 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.734609 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jv5tm\" (UniqueName: \"kubernetes.io/projected/44c0d1dd-5d1c-443a-a71d-40d163d60028-kube-api-access-jv5tm\") pod \"kube-storage-version-migrator-operator-6769c5d45-9776m\" (UID: \"44c0d1dd-5d1c-443a-a71d-40d163d60028\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" Apr 22 19:25:23.734738 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.734726 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f9nf\" (UniqueName: \"kubernetes.io/projected/c82115e9-4eca-4c74-975e-c08cb3ed3c26-kube-api-access-2f9nf\") pod \"cluster-samples-operator-6dc5bdb6b4-tht98\" (UID: \"c82115e9-4eca-4c74-975e-c08cb3ed3c26\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98" Apr 22 19:25:23.735461 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.735440 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c0d1dd-5d1c-443a-a71d-40d163d60028-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9776m\" (UID: \"44c0d1dd-5d1c-443a-a71d-40d163d60028\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" Apr 22 19:25:23.738223 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.738194 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c0d1dd-5d1c-443a-a71d-40d163d60028-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9776m\" (UID: \"44c0d1dd-5d1c-443a-a71d-40d163d60028\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" Apr 22 19:25:23.748591 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.748569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv5tm\" (UniqueName: \"kubernetes.io/projected/44c0d1dd-5d1c-443a-a71d-40d163d60028-kube-api-access-jv5tm\") pod \"kube-storage-version-migrator-operator-6769c5d45-9776m\" (UID: \"44c0d1dd-5d1c-443a-a71d-40d163d60028\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" Apr 22 19:25:23.835520 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.835486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23701c89-5e40-43ee-bab7-fe2709643e97-serving-cert\") pod \"console-operator-9d4b6777b-8hr98\" (UID: \"23701c89-5e40-43ee-bab7-fe2709643e97\") " pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" Apr 22 19:25:23.835520 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.835525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2f9nf\" (UniqueName: \"kubernetes.io/projected/c82115e9-4eca-4c74-975e-c08cb3ed3c26-kube-api-access-2f9nf\") pod \"cluster-samples-operator-6dc5bdb6b4-tht98\" (UID: \"c82115e9-4eca-4c74-975e-c08cb3ed3c26\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98" Apr 22 19:25:23.835718 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.835649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23701c89-5e40-43ee-bab7-fe2709643e97-trusted-ca\") pod \"console-operator-9d4b6777b-8hr98\" (UID: \"23701c89-5e40-43ee-bab7-fe2709643e97\") " pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" Apr 22 19:25:23.835882 ip-10-0-138-15 kubenswrapper[2576]: I0422 
19:25:23.835862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23701c89-5e40-43ee-bab7-fe2709643e97-config\") pod \"console-operator-9d4b6777b-8hr98\" (UID: \"23701c89-5e40-43ee-bab7-fe2709643e97\") " pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" Apr 22 19:25:23.835946 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.835898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tht98\" (UID: \"c82115e9-4eca-4c74-975e-c08cb3ed3c26\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98" Apr 22 19:25:23.836028 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:23.836004 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:25:23.836159 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.836062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpxhg\" (UniqueName: \"kubernetes.io/projected/23701c89-5e40-43ee-bab7-fe2709643e97-kube-api-access-fpxhg\") pod \"console-operator-9d4b6777b-8hr98\" (UID: \"23701c89-5e40-43ee-bab7-fe2709643e97\") " pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" Apr 22 19:25:23.836159 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:23.836081 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls podName:c82115e9-4eca-4c74-975e-c08cb3ed3c26 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:24.336060361 +0000 UTC m=+144.758725668 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tht98" (UID: "c82115e9-4eca-4c74-975e-c08cb3ed3c26") : secret "samples-operator-tls" not found Apr 22 19:25:23.836508 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.836488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23701c89-5e40-43ee-bab7-fe2709643e97-config\") pod \"console-operator-9d4b6777b-8hr98\" (UID: \"23701c89-5e40-43ee-bab7-fe2709643e97\") " pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" Apr 22 19:25:23.836568 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.836523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23701c89-5e40-43ee-bab7-fe2709643e97-trusted-ca\") pod \"console-operator-9d4b6777b-8hr98\" (UID: \"23701c89-5e40-43ee-bab7-fe2709643e97\") " pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" Apr 22 19:25:23.837878 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.837861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23701c89-5e40-43ee-bab7-fe2709643e97-serving-cert\") pod \"console-operator-9d4b6777b-8hr98\" (UID: \"23701c89-5e40-43ee-bab7-fe2709643e97\") " pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" Apr 22 19:25:23.846594 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.846536 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpxhg\" (UniqueName: \"kubernetes.io/projected/23701c89-5e40-43ee-bab7-fe2709643e97-kube-api-access-fpxhg\") pod \"console-operator-9d4b6777b-8hr98\" (UID: \"23701c89-5e40-43ee-bab7-fe2709643e97\") " pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" Apr 22 19:25:23.846766 
ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.846744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f9nf\" (UniqueName: \"kubernetes.io/projected/c82115e9-4eca-4c74-975e-c08cb3ed3c26-kube-api-access-2f9nf\") pod \"cluster-samples-operator-6dc5bdb6b4-tht98\" (UID: \"c82115e9-4eca-4c74-975e-c08cb3ed3c26\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98" Apr 22 19:25:23.904505 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:23.904464 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" Apr 22 19:25:24.015632 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:24.015601 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m"] Apr 22 19:25:24.018473 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:25:24.018444 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44c0d1dd_5d1c_443a_a71d_40d163d60028.slice/crio-7e1d467b0f1fe9b091c2ab087b57c704db5c23ffe9e53f57e874999faa79f6cc WatchSource:0}: Error finding container 7e1d467b0f1fe9b091c2ab087b57c704db5c23ffe9e53f57e874999faa79f6cc: Status 404 returned error can't find the container with id 7e1d467b0f1fe9b091c2ab087b57c704db5c23ffe9e53f57e874999faa79f6cc Apr 22 19:25:24.022745 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:24.022708 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" Apr 22 19:25:24.134505 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:24.134468 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8hr98"] Apr 22 19:25:24.137002 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:25:24.136978 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23701c89_5e40_43ee_bab7_fe2709643e97.slice/crio-862631be87fd3078d2108c4dc9aa708b293968a5dcc84931117fcd8860cf191d WatchSource:0}: Error finding container 862631be87fd3078d2108c4dc9aa708b293968a5dcc84931117fcd8860cf191d: Status 404 returned error can't find the container with id 862631be87fd3078d2108c4dc9aa708b293968a5dcc84931117fcd8860cf191d Apr 22 19:25:24.339325 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:24.339288 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tht98\" (UID: \"c82115e9-4eca-4c74-975e-c08cb3ed3c26\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98" Apr 22 19:25:24.339503 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:24.339443 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:25:24.339550 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:24.339512 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls podName:c82115e9-4eca-4c74-975e-c08cb3ed3c26 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:25.339494017 +0000 UTC m=+145.762159302 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tht98" (UID: "c82115e9-4eca-4c74-975e-c08cb3ed3c26") : secret "samples-operator-tls" not found Apr 22 19:25:24.534328 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:24.534234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" event={"ID":"23701c89-5e40-43ee-bab7-fe2709643e97","Type":"ContainerStarted","Data":"862631be87fd3078d2108c4dc9aa708b293968a5dcc84931117fcd8860cf191d"} Apr 22 19:25:24.535068 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:24.535046 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" event={"ID":"44c0d1dd-5d1c-443a-a71d-40d163d60028","Type":"ContainerStarted","Data":"7e1d467b0f1fe9b091c2ab087b57c704db5c23ffe9e53f57e874999faa79f6cc"} Apr 22 19:25:25.346489 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:25.346453 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tht98\" (UID: \"c82115e9-4eca-4c74-975e-c08cb3ed3c26\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98" Apr 22 19:25:25.346962 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:25.346606 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:25:25.346962 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:25.346672 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls 
podName:c82115e9-4eca-4c74-975e-c08cb3ed3c26 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:27.346654083 +0000 UTC m=+147.769319368 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tht98" (UID: "c82115e9-4eca-4c74-975e-c08cb3ed3c26") : secret "samples-operator-tls" not found Apr 22 19:25:27.362654 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:27.362620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tht98\" (UID: \"c82115e9-4eca-4c74-975e-c08cb3ed3c26\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98" Apr 22 19:25:27.363004 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:27.362788 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:25:27.363004 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:27.362868 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls podName:c82115e9-4eca-4c74-975e-c08cb3ed3c26 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:31.362846967 +0000 UTC m=+151.785512258 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tht98" (UID: "c82115e9-4eca-4c74-975e-c08cb3ed3c26") : secret "samples-operator-tls" not found
Apr 22 19:25:27.542155 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:27.542124 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/0.log"
Apr 22 19:25:27.542330 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:27.542169 2576 generic.go:358] "Generic (PLEG): container finished" podID="23701c89-5e40-43ee-bab7-fe2709643e97" containerID="c2f65cec464464bb0d2d4c29fef9c98bc88b57b35d427552c5a5b1848b37a429" exitCode=255
Apr 22 19:25:27.542330 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:27.542258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" event={"ID":"23701c89-5e40-43ee-bab7-fe2709643e97","Type":"ContainerDied","Data":"c2f65cec464464bb0d2d4c29fef9c98bc88b57b35d427552c5a5b1848b37a429"}
Apr 22 19:25:27.542553 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:27.542480 2576 scope.go:117] "RemoveContainer" containerID="c2f65cec464464bb0d2d4c29fef9c98bc88b57b35d427552c5a5b1848b37a429"
Apr 22 19:25:27.543577 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:27.543553 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" event={"ID":"44c0d1dd-5d1c-443a-a71d-40d163d60028","Type":"ContainerStarted","Data":"9952b503e626700b8648b3e99f86c95cfe7c93ee36da65136ed136d2ade0ca28"}
Apr 22 19:25:27.575164 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:27.575123 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" podStartSLOduration=1.803614224 podStartE2EDuration="4.575110581s" podCreationTimestamp="2026-04-22 19:25:23 +0000 UTC" firstStartedPulling="2026-04-22 19:25:24.020292561 +0000 UTC m=+144.442957849" lastFinishedPulling="2026-04-22 19:25:26.791788918 +0000 UTC m=+147.214454206" observedRunningTime="2026-04-22 19:25:27.57430686 +0000 UTC m=+147.996972166" watchObservedRunningTime="2026-04-22 19:25:27.575110581 +0000 UTC m=+147.997775888"
Apr 22 19:25:28.548040 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:28.548015 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/1.log"
Apr 22 19:25:28.548433 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:28.548349 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/0.log"
Apr 22 19:25:28.548433 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:28.548382 2576 generic.go:358] "Generic (PLEG): container finished" podID="23701c89-5e40-43ee-bab7-fe2709643e97" containerID="876c49b7ad7de9245500512e02cb90774e9717f3b232803b0fd5e5225ecd9f47" exitCode=255
Apr 22 19:25:28.548512 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:28.548466 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" event={"ID":"23701c89-5e40-43ee-bab7-fe2709643e97","Type":"ContainerDied","Data":"876c49b7ad7de9245500512e02cb90774e9717f3b232803b0fd5e5225ecd9f47"}
Apr 22 19:25:28.548545 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:28.548517 2576 scope.go:117] "RemoveContainer" containerID="c2f65cec464464bb0d2d4c29fef9c98bc88b57b35d427552c5a5b1848b37a429"
Apr 22 19:25:28.548811 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:28.548794 2576 scope.go:117] "RemoveContainer" containerID="876c49b7ad7de9245500512e02cb90774e9717f3b232803b0fd5e5225ecd9f47"
Apr 22 19:25:28.549076 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:28.549053 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8hr98_openshift-console-operator(23701c89-5e40-43ee-bab7-fe2709643e97)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" podUID="23701c89-5e40-43ee-bab7-fe2709643e97"
Apr 22 19:25:29.552745 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:29.552715 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/1.log"
Apr 22 19:25:29.553219 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:29.553155 2576 scope.go:117] "RemoveContainer" containerID="876c49b7ad7de9245500512e02cb90774e9717f3b232803b0fd5e5225ecd9f47"
Apr 22 19:25:29.553394 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:29.553371 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8hr98_openshift-console-operator(23701c89-5e40-43ee-bab7-fe2709643e97)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" podUID="23701c89-5e40-43ee-bab7-fe2709643e97"
Apr 22 19:25:29.985210 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:29.985176 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fg9cf"]
Apr 22 19:25:29.989222 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:29.989200 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-fg9cf"
Apr 22 19:25:29.992032 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:29.991998 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 22 19:25:29.992158 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:29.992001 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 22 19:25:29.992158 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:29.992044 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-k62p9\""
Apr 22 19:25:29.992158 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:29.992056 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 22 19:25:29.992158 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:29.992034 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 22 19:25:29.996766 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:29.996746 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fg9cf"]
Apr 22 19:25:30.083181 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:30.083148 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1fd13ccf-c266-4251-ae2d-f301a97c9d51-signing-key\") pod \"service-ca-865cb79987-fg9cf\" (UID: \"1fd13ccf-c266-4251-ae2d-f301a97c9d51\") " pod="openshift-service-ca/service-ca-865cb79987-fg9cf"
Apr 22 19:25:30.083334 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:30.083267 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1fd13ccf-c266-4251-ae2d-f301a97c9d51-signing-cabundle\") pod \"service-ca-865cb79987-fg9cf\" (UID: \"1fd13ccf-c266-4251-ae2d-f301a97c9d51\") " pod="openshift-service-ca/service-ca-865cb79987-fg9cf"
Apr 22 19:25:30.083334 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:30.083327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ch5k\" (UniqueName: \"kubernetes.io/projected/1fd13ccf-c266-4251-ae2d-f301a97c9d51-kube-api-access-7ch5k\") pod \"service-ca-865cb79987-fg9cf\" (UID: \"1fd13ccf-c266-4251-ae2d-f301a97c9d51\") " pod="openshift-service-ca/service-ca-865cb79987-fg9cf"
Apr 22 19:25:30.183859 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:30.183825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1fd13ccf-c266-4251-ae2d-f301a97c9d51-signing-cabundle\") pod \"service-ca-865cb79987-fg9cf\" (UID: \"1fd13ccf-c266-4251-ae2d-f301a97c9d51\") " pod="openshift-service-ca/service-ca-865cb79987-fg9cf"
Apr 22 19:25:30.184059 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:30.183935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ch5k\" (UniqueName: \"kubernetes.io/projected/1fd13ccf-c266-4251-ae2d-f301a97c9d51-kube-api-access-7ch5k\") pod \"service-ca-865cb79987-fg9cf\" (UID: \"1fd13ccf-c266-4251-ae2d-f301a97c9d51\") " pod="openshift-service-ca/service-ca-865cb79987-fg9cf"
Apr 22 19:25:30.184059 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:30.183965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1fd13ccf-c266-4251-ae2d-f301a97c9d51-signing-key\") pod \"service-ca-865cb79987-fg9cf\" (UID: \"1fd13ccf-c266-4251-ae2d-f301a97c9d51\") " pod="openshift-service-ca/service-ca-865cb79987-fg9cf"
Apr 22 19:25:30.184462 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:30.184436 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1fd13ccf-c266-4251-ae2d-f301a97c9d51-signing-cabundle\") pod \"service-ca-865cb79987-fg9cf\" (UID: \"1fd13ccf-c266-4251-ae2d-f301a97c9d51\") " pod="openshift-service-ca/service-ca-865cb79987-fg9cf"
Apr 22 19:25:30.186193 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:30.186175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1fd13ccf-c266-4251-ae2d-f301a97c9d51-signing-key\") pod \"service-ca-865cb79987-fg9cf\" (UID: \"1fd13ccf-c266-4251-ae2d-f301a97c9d51\") " pod="openshift-service-ca/service-ca-865cb79987-fg9cf"
Apr 22 19:25:30.192055 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:30.192033 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ch5k\" (UniqueName: \"kubernetes.io/projected/1fd13ccf-c266-4251-ae2d-f301a97c9d51-kube-api-access-7ch5k\") pod \"service-ca-865cb79987-fg9cf\" (UID: \"1fd13ccf-c266-4251-ae2d-f301a97c9d51\") " pod="openshift-service-ca/service-ca-865cb79987-fg9cf"
Apr 22 19:25:30.298609 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:30.298515 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-fg9cf"
Apr 22 19:25:30.412868 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:30.412841 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fg9cf"]
Apr 22 19:25:30.416003 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:25:30.415976 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fd13ccf_c266_4251_ae2d_f301a97c9d51.slice/crio-b8e7bc4fd2d3806e5e22771431d85e93b94e5aa9c335244e07e4d3dc43a7a4a7 WatchSource:0}: Error finding container b8e7bc4fd2d3806e5e22771431d85e93b94e5aa9c335244e07e4d3dc43a7a4a7: Status 404 returned error can't find the container with id b8e7bc4fd2d3806e5e22771431d85e93b94e5aa9c335244e07e4d3dc43a7a4a7
Apr 22 19:25:30.556729 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:30.556646 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-fg9cf" event={"ID":"1fd13ccf-c266-4251-ae2d-f301a97c9d51","Type":"ContainerStarted","Data":"b8e7bc4fd2d3806e5e22771431d85e93b94e5aa9c335244e07e4d3dc43a7a4a7"}
Apr 22 19:25:31.392388 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:31.392353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tht98\" (UID: \"c82115e9-4eca-4c74-975e-c08cb3ed3c26\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98"
Apr 22 19:25:31.392566 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:31.392529 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 19:25:31.392620 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:31.392611 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls podName:c82115e9-4eca-4c74-975e-c08cb3ed3c26 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:39.392589114 +0000 UTC m=+159.815254406 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tht98" (UID: "c82115e9-4eca-4c74-975e-c08cb3ed3c26") : secret "samples-operator-tls" not found
Apr 22 19:25:32.562311 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:32.562223 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-fg9cf" event={"ID":"1fd13ccf-c266-4251-ae2d-f301a97c9d51","Type":"ContainerStarted","Data":"b117830e170e8eda16b41231b37e6384dbf5e0a9d3c5a0192e31675ad492413e"}
Apr 22 19:25:32.578760 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:32.578716 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-fg9cf" podStartSLOduration=1.818647156 podStartE2EDuration="3.578704022s" podCreationTimestamp="2026-04-22 19:25:29 +0000 UTC" firstStartedPulling="2026-04-22 19:25:30.417810433 +0000 UTC m=+150.840475721" lastFinishedPulling="2026-04-22 19:25:32.177867298 +0000 UTC m=+152.600532587" observedRunningTime="2026-04-22 19:25:32.578617275 +0000 UTC m=+153.001282584" watchObservedRunningTime="2026-04-22 19:25:32.578704022 +0000 UTC m=+153.001369328"
Apr 22 19:25:34.023418 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:34.023376 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8hr98"
Apr 22 19:25:34.023418 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:34.023416 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-8hr98"
Apr 22 19:25:34.023821 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:34.023763 2576 scope.go:117] "RemoveContainer" containerID="876c49b7ad7de9245500512e02cb90774e9717f3b232803b0fd5e5225ecd9f47"
Apr 22 19:25:34.023964 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:34.023945 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8hr98_openshift-console-operator(23701c89-5e40-43ee-bab7-fe2709643e97)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" podUID="23701c89-5e40-43ee-bab7-fe2709643e97"
Apr 22 19:25:35.500729 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:35.500679 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" podUID="130af29b-4cd5-410b-b95b-ed57b79c76d2"
Apr 22 19:25:35.518009 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:35.517966 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" podUID="18fa20f4-e79f-4f01-9142-38e98b2350d6"
Apr 22 19:25:35.530868 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:35.530823 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-n2sn2" podUID="d0674059-fa3f-4411-bea0-5b58dca69acc"
Apr 22 19:25:35.545998 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:35.545965 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-vk4gw" podUID="0094a2f2-1687-4429-a919-c7f5d7498255"
Apr 22 19:25:35.570458 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:35.570432 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t"
Apr 22 19:25:35.570458 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:35.570467 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5"
Apr 22 19:25:35.570619 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:35.570474 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-n2sn2"
Apr 22 19:25:37.164092 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:37.164047 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-m8fmk" podUID="957d9773-bf39-486e-a32e-eba60e7b49e9"
Apr 22 19:25:39.458703 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:39.458666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tht98\" (UID: \"c82115e9-4eca-4c74-975e-c08cb3ed3c26\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98"
Apr 22 19:25:39.460996 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:39.460973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c82115e9-4eca-4c74-975e-c08cb3ed3c26-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tht98\" (UID: \"c82115e9-4eca-4c74-975e-c08cb3ed3c26\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98"
Apr 22 19:25:39.615803 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:39.615767 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98"
Apr 22 19:25:39.742395 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:39.742316 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98"]
Apr 22 19:25:40.467104 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:40.467070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5"
Apr 22 19:25:40.467596 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:40.467136 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t"
Apr 22 19:25:40.467596 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:40.467241 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:25:40.467596 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:25:40.467310 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert podName:18fa20f4-e79f-4f01-9142-38e98b2350d6 nodeName:}" failed. No retries permitted until 2026-04-22 19:27:42.467291729 +0000 UTC m=+282.889957018 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9f56t" (UID: "18fa20f4-e79f-4f01-9142-38e98b2350d6") : secret "networking-console-plugin-cert" not found
Apr 22 19:25:40.469659 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:40.469637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls\") pod \"image-registry-5d578bf96b-fdww5\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " pod="openshift-image-registry/image-registry-5d578bf96b-fdww5"
Apr 22 19:25:40.568505 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:40.568470 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert\") pod \"ingress-canary-vk4gw\" (UID: \"0094a2f2-1687-4429-a919-c7f5d7498255\") " pod="openshift-ingress-canary/ingress-canary-vk4gw"
Apr 22 19:25:40.568505 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:40.568511 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2"
Apr 22 19:25:40.570773 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:40.570751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0674059-fa3f-4411-bea0-5b58dca69acc-metrics-tls\") pod \"dns-default-n2sn2\" (UID: \"d0674059-fa3f-4411-bea0-5b58dca69acc\") " pod="openshift-dns/dns-default-n2sn2"
Apr 22 19:25:40.570900 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:40.570882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0094a2f2-1687-4429-a919-c7f5d7498255-cert\") pod \"ingress-canary-vk4gw\" (UID: \"0094a2f2-1687-4429-a919-c7f5d7498255\") " pod="openshift-ingress-canary/ingress-canary-vk4gw"
Apr 22 19:25:40.582957 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:40.582913 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98" event={"ID":"c82115e9-4eca-4c74-975e-c08cb3ed3c26","Type":"ContainerStarted","Data":"f5e74567e6297e58ad38ffe84ca1a4f9af6f4745d4979ec5bd40e08f0ff54f8b"}
Apr 22 19:25:40.674498 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:40.674465 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2jtv7\""
Apr 22 19:25:40.674728 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:40.674680 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f8zlb\""
Apr 22 19:25:40.681169 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:40.681130 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5"
Apr 22 19:25:40.681282 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:40.681210 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-n2sn2"
Apr 22 19:25:40.849828 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:40.849787 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d578bf96b-fdww5"]
Apr 22 19:25:40.853336 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:40.853313 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n2sn2"]
Apr 22 19:25:40.854910 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:25:40.854882 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod130af29b_4cd5_410b_b95b_ed57b79c76d2.slice/crio-ac1bf49c21c695f453d40195eaa3e9320a130cd9e2b5a5347424c0da9f08a4d3 WatchSource:0}: Error finding container ac1bf49c21c695f453d40195eaa3e9320a130cd9e2b5a5347424c0da9f08a4d3: Status 404 returned error can't find the container with id ac1bf49c21c695f453d40195eaa3e9320a130cd9e2b5a5347424c0da9f08a4d3
Apr 22 19:25:40.855616 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:25:40.855587 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0674059_fa3f_4411_bea0_5b58dca69acc.slice/crio-0b489a2a9ad3b2dcc9e4b8655056e971a4577c3a7b0a927ab502ab06e6ff3098 WatchSource:0}: Error finding container 0b489a2a9ad3b2dcc9e4b8655056e971a4577c3a7b0a927ab502ab06e6ff3098: Status 404 returned error can't find the container with id 0b489a2a9ad3b2dcc9e4b8655056e971a4577c3a7b0a927ab502ab06e6ff3098
Apr 22 19:25:41.586891 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:41.586861 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n2sn2" event={"ID":"d0674059-fa3f-4411-bea0-5b58dca69acc","Type":"ContainerStarted","Data":"0b489a2a9ad3b2dcc9e4b8655056e971a4577c3a7b0a927ab502ab06e6ff3098"}
Apr 22 19:25:41.588750 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:41.588723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" event={"ID":"130af29b-4cd5-410b-b95b-ed57b79c76d2","Type":"ContainerStarted","Data":"2b763c0125ae71784a1a1c2fbecc1664ad5e8904aee0a122aa092263005aab08"}
Apr 22 19:25:41.588750 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:41.588758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" event={"ID":"130af29b-4cd5-410b-b95b-ed57b79c76d2","Type":"ContainerStarted","Data":"ac1bf49c21c695f453d40195eaa3e9320a130cd9e2b5a5347424c0da9f08a4d3"}
Apr 22 19:25:41.588904 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:41.588888 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5"
Apr 22 19:25:41.609559 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:41.609517 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" podStartSLOduration=161.609504098 podStartE2EDuration="2m41.609504098s" podCreationTimestamp="2026-04-22 19:23:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:25:41.608560082 +0000 UTC m=+162.031225392" watchObservedRunningTime="2026-04-22 19:25:41.609504098 +0000 UTC m=+162.032169404"
Apr 22 19:25:42.595541 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:42.595510 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98" event={"ID":"c82115e9-4eca-4c74-975e-c08cb3ed3c26","Type":"ContainerStarted","Data":"c3ba95d10f98467154a17cf5ab1e2367d1257fc8bf2d5479c290f783bb4d7733"}
Apr 22 19:25:42.595954 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:42.595548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98" event={"ID":"c82115e9-4eca-4c74-975e-c08cb3ed3c26","Type":"ContainerStarted","Data":"ae6f1063b21d5e1a667b3d1cc6ed933700b92e20974156081e7f9f0abb696723"}
Apr 22 19:25:42.612395 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:42.612341 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tht98" podStartSLOduration=17.821746153 podStartE2EDuration="19.61231459s" podCreationTimestamp="2026-04-22 19:25:23 +0000 UTC" firstStartedPulling="2026-04-22 19:25:39.790648529 +0000 UTC m=+160.213313813" lastFinishedPulling="2026-04-22 19:25:41.581216963 +0000 UTC m=+162.003882250" observedRunningTime="2026-04-22 19:25:42.611813933 +0000 UTC m=+163.034479243" watchObservedRunningTime="2026-04-22 19:25:42.61231459 +0000 UTC m=+163.034979898"
Apr 22 19:25:43.599900 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:43.599860 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n2sn2" event={"ID":"d0674059-fa3f-4411-bea0-5b58dca69acc","Type":"ContainerStarted","Data":"ad9d4142719d334ca742f9ec77ff0940628cba33dae0ab833f746faca05c2079"}
Apr 22 19:25:43.599900 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:43.599897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n2sn2" event={"ID":"d0674059-fa3f-4411-bea0-5b58dca69acc","Type":"ContainerStarted","Data":"25e9eec0599f5365ca0951fccdd96510f4cca32f7491e9cd51449211948f4445"}
Apr 22 19:25:43.621591 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:43.621543 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-n2sn2" podStartSLOduration=129.896686161 podStartE2EDuration="2m11.621531336s" podCreationTimestamp="2026-04-22 19:23:32 +0000 UTC" firstStartedPulling="2026-04-22 19:25:40.857650335 +0000 UTC m=+161.280315623" lastFinishedPulling="2026-04-22 19:25:42.582495513 +0000 UTC m=+163.005160798" observedRunningTime="2026-04-22 19:25:43.619699492 +0000 UTC m=+164.042364799" watchObservedRunningTime="2026-04-22 19:25:43.621531336 +0000 UTC m=+164.044196642"
Apr 22 19:25:44.603239 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:44.603208 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-n2sn2"
Apr 22 19:25:48.151844 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:48.151812 2576 scope.go:117] "RemoveContainer" containerID="876c49b7ad7de9245500512e02cb90774e9717f3b232803b0fd5e5225ecd9f47"
Apr 22 19:25:48.615832 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:48.615802 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/1.log"
Apr 22 19:25:48.616043 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:48.615873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" event={"ID":"23701c89-5e40-43ee-bab7-fe2709643e97","Type":"ContainerStarted","Data":"fa6958e5e4d186ee57aaff1a82295c56cb41ec4ffcdbb53d6e9a1ef120b4a22b"}
Apr 22 19:25:48.616236 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:48.616215 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8hr98"
Apr 22 19:25:48.633522 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:48.633479 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-8hr98" podStartSLOduration=22.983661865 podStartE2EDuration="25.633466174s" podCreationTimestamp="2026-04-22 19:25:23 +0000 UTC" firstStartedPulling="2026-04-22 19:25:24.138944673 +0000 UTC m=+144.561609973" lastFinishedPulling="2026-04-22 19:25:26.788748991 +0000 UTC m=+147.211414282" observedRunningTime="2026-04-22 19:25:48.632706437 +0000 UTC m=+169.055371744" watchObservedRunningTime="2026-04-22 19:25:48.633466174 +0000 UTC m=+169.056131511"
Apr 22 19:25:48.840863 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:48.840832 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-8hr98"
Apr 22 19:25:50.152407 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:50.152338 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vk4gw"
Apr 22 19:25:50.152407 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:50.152401 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk"
Apr 22 19:25:50.155514 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:50.155497 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l8fgp\""
Apr 22 19:25:50.163829 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:50.163813 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vk4gw"
Apr 22 19:25:50.278648 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:50.278617 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vk4gw"]
Apr 22 19:25:50.282225 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:25:50.282199 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0094a2f2_1687_4429_a919_c7f5d7498255.slice/crio-a43d6dc3c4f481adb9564ea6d82ef6ab2ec041353006887ea76d49fd189ac406 WatchSource:0}: Error finding container a43d6dc3c4f481adb9564ea6d82ef6ab2ec041353006887ea76d49fd189ac406: Status 404 returned error can't find the container with id a43d6dc3c4f481adb9564ea6d82ef6ab2ec041353006887ea76d49fd189ac406
Apr 22 19:25:50.624486 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:50.624447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vk4gw" event={"ID":"0094a2f2-1687-4429-a919-c7f5d7498255","Type":"ContainerStarted","Data":"a43d6dc3c4f481adb9564ea6d82ef6ab2ec041353006887ea76d49fd189ac406"}
Apr 22 19:25:52.498852 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.498819 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fn6js"]
Apr 22 19:25:52.502359 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.502339 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fn6js"
Apr 22 19:25:52.505525 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.505505 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 19:25:52.506741 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.506725 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 19:25:52.506990 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.506972 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 19:25:52.507063 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.506982 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-np8h6\""
Apr 22 19:25:52.507122 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.506997 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 19:25:52.515299 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.515278 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fn6js"]
Apr 22 19:25:52.579709 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.579680 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-dppdk"]
Apr 22 19:25:52.582618 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.582600 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dppdk"
Apr 22 19:25:52.584811 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.584783 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j24jl"]
Apr 22 19:25:52.588841 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.588816 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-swq4t"]
Apr 22 19:25:52.588971 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.588866 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-9hknq\""
Apr 22 19:25:52.588971 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.588962 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j24jl"
Apr 22 19:25:52.591525 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.591507 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-9gw97\""
Apr 22 19:25:52.592590 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.592577 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 22 19:25:52.592880 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.592865 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-swq4t"
Apr 22 19:25:52.594725 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.594705 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-dppdk"]
Apr 22 19:25:52.596653 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.596637 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-xgnth\""
Apr 22 19:25:52.596737 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.596675 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 22 19:25:52.597849 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.597833 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 22 19:25:52.607552 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.607534 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j24jl"]
Apr 22 19:25:52.608963 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.608941 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-swq4t"]
Apr 22 19:25:52.632372 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.632325 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vk4gw" event={"ID":"0094a2f2-1687-4429-a919-c7f5d7498255","Type":"ContainerStarted","Data":"f9ae29a9105d9a88ad65153a011a813706439c0c3072bed2e7a96a52532d768e"}
Apr 22 19:25:52.671305 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.671270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5ecb5e1b-abd7-4365-a2b6-55632e29bd79-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-57cf98b594-j24jl\" (UID: \"5ecb5e1b-abd7-4365-a2b6-55632e29bd79\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j24jl" Apr 22 19:25:52.671305 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.671308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/37daebbc-30c3-4548-b751-2f66c70271fa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.671504 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.671326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/37daebbc-30c3-4548-b751-2f66c70271fa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.671504 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.671349 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmzzq\" (UniqueName: \"kubernetes.io/projected/dd892729-a03a-4e23-814f-c6d8f9c486d8-kube-api-access-bmzzq\") pod \"network-check-source-8894fc9bd-dppdk\" (UID: \"dd892729-a03a-4e23-814f-c6d8f9c486d8\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dppdk" Apr 22 19:25:52.671504 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.671399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/37daebbc-30c3-4548-b751-2f66c70271fa-data-volume\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " 
pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.671504 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.671471 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/37daebbc-30c3-4548-b751-2f66c70271fa-crio-socket\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.671504 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.671491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f67l6\" (UniqueName: \"kubernetes.io/projected/37daebbc-30c3-4548-b751-2f66c70271fa-kube-api-access-f67l6\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.671653 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.671531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5hrp\" (UniqueName: \"kubernetes.io/projected/fd57b9ed-3208-41fd-aab6-8c6d3078a852-kube-api-access-n5hrp\") pod \"downloads-6bcc868b7-swq4t\" (UID: \"fd57b9ed-3208-41fd-aab6-8c6d3078a852\") " pod="openshift-console/downloads-6bcc868b7-swq4t" Apr 22 19:25:52.676017 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.675978 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vk4gw" podStartSLOduration=138.977264911 podStartE2EDuration="2m20.675965848s" podCreationTimestamp="2026-04-22 19:23:32 +0000 UTC" firstStartedPulling="2026-04-22 19:25:50.284545754 +0000 UTC m=+170.707211043" lastFinishedPulling="2026-04-22 19:25:51.983246689 +0000 UTC m=+172.405911980" observedRunningTime="2026-04-22 19:25:52.675681437 +0000 UTC m=+173.098346745" watchObservedRunningTime="2026-04-22 
19:25:52.675965848 +0000 UTC m=+173.098631154" Apr 22 19:25:52.772948 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.772824 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/37daebbc-30c3-4548-b751-2f66c70271fa-crio-socket\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.772948 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.772868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f67l6\" (UniqueName: \"kubernetes.io/projected/37daebbc-30c3-4548-b751-2f66c70271fa-kube-api-access-f67l6\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.773180 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.772992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/37daebbc-30c3-4548-b751-2f66c70271fa-crio-socket\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.773180 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.773107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5hrp\" (UniqueName: \"kubernetes.io/projected/fd57b9ed-3208-41fd-aab6-8c6d3078a852-kube-api-access-n5hrp\") pod \"downloads-6bcc868b7-swq4t\" (UID: \"fd57b9ed-3208-41fd-aab6-8c6d3078a852\") " pod="openshift-console/downloads-6bcc868b7-swq4t" Apr 22 19:25:52.773180 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.773145 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/5ecb5e1b-abd7-4365-a2b6-55632e29bd79-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-j24jl\" (UID: \"5ecb5e1b-abd7-4365-a2b6-55632e29bd79\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j24jl" Apr 22 19:25:52.773180 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.773169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/37daebbc-30c3-4548-b751-2f66c70271fa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.773374 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.773293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/37daebbc-30c3-4548-b751-2f66c70271fa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.773374 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.773326 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmzzq\" (UniqueName: \"kubernetes.io/projected/dd892729-a03a-4e23-814f-c6d8f9c486d8-kube-api-access-bmzzq\") pod \"network-check-source-8894fc9bd-dppdk\" (UID: \"dd892729-a03a-4e23-814f-c6d8f9c486d8\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dppdk" Apr 22 19:25:52.773476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.773375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/37daebbc-30c3-4548-b751-2f66c70271fa-data-volume\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " 
pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.773748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.773729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/37daebbc-30c3-4548-b751-2f66c70271fa-data-volume\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.773939 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.773899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/37daebbc-30c3-4548-b751-2f66c70271fa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.775595 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.775568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5ecb5e1b-abd7-4365-a2b6-55632e29bd79-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-j24jl\" (UID: \"5ecb5e1b-abd7-4365-a2b6-55632e29bd79\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j24jl" Apr 22 19:25:52.775700 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.775612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/37daebbc-30c3-4548-b751-2f66c70271fa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.785726 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.785696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmzzq\" (UniqueName: 
\"kubernetes.io/projected/dd892729-a03a-4e23-814f-c6d8f9c486d8-kube-api-access-bmzzq\") pod \"network-check-source-8894fc9bd-dppdk\" (UID: \"dd892729-a03a-4e23-814f-c6d8f9c486d8\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dppdk" Apr 22 19:25:52.785990 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.785972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5hrp\" (UniqueName: \"kubernetes.io/projected/fd57b9ed-3208-41fd-aab6-8c6d3078a852-kube-api-access-n5hrp\") pod \"downloads-6bcc868b7-swq4t\" (UID: \"fd57b9ed-3208-41fd-aab6-8c6d3078a852\") " pod="openshift-console/downloads-6bcc868b7-swq4t" Apr 22 19:25:52.786208 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.786192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f67l6\" (UniqueName: \"kubernetes.io/projected/37daebbc-30c3-4548-b751-2f66c70271fa-kube-api-access-f67l6\") pod \"insights-runtime-extractor-fn6js\" (UID: \"37daebbc-30c3-4548-b751-2f66c70271fa\") " pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.811430 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.811405 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fn6js" Apr 22 19:25:52.892213 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.892183 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dppdk" Apr 22 19:25:52.899738 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.899603 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j24jl" Apr 22 19:25:52.904452 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.904424 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-swq4t" Apr 22 19:25:52.940374 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:52.940134 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fn6js"] Apr 22 19:25:52.945726 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:25:52.945662 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37daebbc_30c3_4548_b751_2f66c70271fa.slice/crio-08f31009abf6283943b7a952d4b16e5a082189528103e627cb5551dceafeed0f WatchSource:0}: Error finding container 08f31009abf6283943b7a952d4b16e5a082189528103e627cb5551dceafeed0f: Status 404 returned error can't find the container with id 08f31009abf6283943b7a952d4b16e5a082189528103e627cb5551dceafeed0f Apr 22 19:25:53.032525 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:53.032497 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-dppdk"] Apr 22 19:25:53.036539 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:25:53.036479 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd892729_a03a_4e23_814f_c6d8f9c486d8.slice/crio-a0450c8762ea1e65b953c260febe18546c0f1a7d966b5ee5f80833ebf7505e3c WatchSource:0}: Error finding container a0450c8762ea1e65b953c260febe18546c0f1a7d966b5ee5f80833ebf7505e3c: Status 404 returned error can't find the container with id a0450c8762ea1e65b953c260febe18546c0f1a7d966b5ee5f80833ebf7505e3c Apr 22 19:25:53.054272 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:53.054129 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j24jl"] Apr 22 19:25:53.057888 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:25:53.057851 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ecb5e1b_abd7_4365_a2b6_55632e29bd79.slice/crio-36a6788d03f0e49923ae658bf2d325ac7b8197e65e75730a063edf5bb9831b15 WatchSource:0}: Error finding container 36a6788d03f0e49923ae658bf2d325ac7b8197e65e75730a063edf5bb9831b15: Status 404 returned error can't find the container with id 36a6788d03f0e49923ae658bf2d325ac7b8197e65e75730a063edf5bb9831b15 Apr 22 19:25:53.074914 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:53.074886 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-swq4t"] Apr 22 19:25:53.079371 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:25:53.079346 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd57b9ed_3208_41fd_aab6_8c6d3078a852.slice/crio-954688ff4178b014623459d8c83f613bdd93689a6c4c818cbedbbb9b5e609142 WatchSource:0}: Error finding container 954688ff4178b014623459d8c83f613bdd93689a6c4c818cbedbbb9b5e609142: Status 404 returned error can't find the container with id 954688ff4178b014623459d8c83f613bdd93689a6c4c818cbedbbb9b5e609142 Apr 22 19:25:53.636803 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:53.636767 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fn6js" event={"ID":"37daebbc-30c3-4548-b751-2f66c70271fa","Type":"ContainerStarted","Data":"2b81409e93b4bd0dca1213a4f66995b2cca2f67dc0bb4d4d5eb7d187197b7667"} Apr 22 19:25:53.637216 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:53.636822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fn6js" event={"ID":"37daebbc-30c3-4548-b751-2f66c70271fa","Type":"ContainerStarted","Data":"08f31009abf6283943b7a952d4b16e5a082189528103e627cb5551dceafeed0f"} Apr 22 19:25:53.637811 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:53.637783 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-6bcc868b7-swq4t" event={"ID":"fd57b9ed-3208-41fd-aab6-8c6d3078a852","Type":"ContainerStarted","Data":"954688ff4178b014623459d8c83f613bdd93689a6c4c818cbedbbb9b5e609142"} Apr 22 19:25:53.638774 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:53.638752 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j24jl" event={"ID":"5ecb5e1b-abd7-4365-a2b6-55632e29bd79","Type":"ContainerStarted","Data":"36a6788d03f0e49923ae658bf2d325ac7b8197e65e75730a063edf5bb9831b15"} Apr 22 19:25:53.639963 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:53.639942 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dppdk" event={"ID":"dd892729-a03a-4e23-814f-c6d8f9c486d8","Type":"ContainerStarted","Data":"b06bbe50b5a77334539b6fb137e743656c510c3dc885e6981a9fbfa120f10aac"} Apr 22 19:25:53.640058 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:53.639970 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dppdk" event={"ID":"dd892729-a03a-4e23-814f-c6d8f9c486d8","Type":"ContainerStarted","Data":"a0450c8762ea1e65b953c260febe18546c0f1a7d966b5ee5f80833ebf7505e3c"} Apr 22 19:25:53.656139 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:53.656082 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dppdk" podStartSLOduration=1.656063568 podStartE2EDuration="1.656063568s" podCreationTimestamp="2026-04-22 19:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:25:53.65494118 +0000 UTC m=+174.077606492" watchObservedRunningTime="2026-04-22 19:25:53.656063568 +0000 UTC m=+174.078728877" Apr 22 19:25:54.610251 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:54.610223 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-n2sn2" Apr 22 19:25:54.646434 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:54.645966 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j24jl" event={"ID":"5ecb5e1b-abd7-4365-a2b6-55632e29bd79","Type":"ContainerStarted","Data":"d3eff4dc334ffd9c09fc5df253b191657ae903e4a10e5879ad5fe49e4ef57b9b"} Apr 22 19:25:54.646434 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:54.646402 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j24jl" Apr 22 19:25:54.651406 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:54.651377 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fn6js" event={"ID":"37daebbc-30c3-4548-b751-2f66c70271fa","Type":"ContainerStarted","Data":"1075956c792431893df94f3c61426be23db9314a7855ecb3543146d44be46cc2"} Apr 22 19:25:54.652814 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:54.652791 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j24jl" Apr 22 19:25:54.662029 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:54.661951 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j24jl" podStartSLOduration=1.372883035 podStartE2EDuration="2.661935481s" podCreationTimestamp="2026-04-22 19:25:52 +0000 UTC" firstStartedPulling="2026-04-22 19:25:53.06069471 +0000 UTC m=+173.483359998" lastFinishedPulling="2026-04-22 19:25:54.349747158 +0000 UTC m=+174.772412444" observedRunningTime="2026-04-22 19:25:54.661403611 +0000 UTC m=+175.084068919" watchObservedRunningTime="2026-04-22 19:25:54.661935481 +0000 UTC m=+175.084600779" Apr 22 19:25:55.655642 
ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:55.655605 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fn6js" event={"ID":"37daebbc-30c3-4548-b751-2f66c70271fa","Type":"ContainerStarted","Data":"c43e1955b483b01121c7d5994dc7945756446e39fcac05025b83adafdaf34a38"} Apr 22 19:25:55.674066 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:55.673991 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fn6js" podStartSLOduration=1.5697938059999998 podStartE2EDuration="3.673976406s" podCreationTimestamp="2026-04-22 19:25:52 +0000 UTC" firstStartedPulling="2026-04-22 19:25:53.035560555 +0000 UTC m=+173.458225854" lastFinishedPulling="2026-04-22 19:25:55.139743166 +0000 UTC m=+175.562408454" observedRunningTime="2026-04-22 19:25:55.673012875 +0000 UTC m=+176.095678173" watchObservedRunningTime="2026-04-22 19:25:55.673976406 +0000 UTC m=+176.096641713" Apr 22 19:25:59.863940 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.863890 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8dn6z"] Apr 22 19:25:59.868331 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.868308 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:25:59.871865 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.871239 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:25:59.871865 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.871681 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:25:59.872857 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.872692 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:25:59.872857 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.872724 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:25:59.873049 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.872868 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:25:59.873049 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.873007 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:25:59.873151 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.873128 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-nfpmc\"" Apr 22 19:25:59.938688 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.938648 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-wtmp\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " 
pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:25:59.938843 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.938692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:25:59.938843 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.938741 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c24fec81-fd18-43a5-884d-38c5bb7a71ab-sys\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:25:59.938843 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.938777 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-accelerators-collector-config\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:25:59.938843 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.938804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-textfile\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:25:59.939074 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.938851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"root\" (UniqueName: \"kubernetes.io/host-path/c24fec81-fd18-43a5-884d-38c5bb7a71ab-root\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:25:59.939074 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.938887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-tls\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:25:59.939074 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.939000 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c24fec81-fd18-43a5-884d-38c5bb7a71ab-metrics-client-ca\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:25:59.939074 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:25:59.939070 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmcss\" (UniqueName: \"kubernetes.io/projected/c24fec81-fd18-43a5-884d-38c5bb7a71ab-kube-api-access-cmcss\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.039850 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.039427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-wtmp\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.039850 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.039474 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.039850 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.039505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c24fec81-fd18-43a5-884d-38c5bb7a71ab-sys\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.039850 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.039530 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-accelerators-collector-config\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.039850 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.039557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-textfile\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.039850 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.039591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c24fec81-fd18-43a5-884d-38c5bb7a71ab-root\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 
19:26:00.039850 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.039630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-tls\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.039850 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.039667 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-wtmp\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.039850 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.039702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c24fec81-fd18-43a5-884d-38c5bb7a71ab-metrics-client-ca\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.039850 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.039723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c24fec81-fd18-43a5-884d-38c5bb7a71ab-root\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.039850 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.039740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmcss\" (UniqueName: \"kubernetes.io/projected/c24fec81-fd18-43a5-884d-38c5bb7a71ab-kube-api-access-cmcss\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 
19:26:00.039850 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.039762 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c24fec81-fd18-43a5-884d-38c5bb7a71ab-sys\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.040750 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.040297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-textfile\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.044499 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.044474 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:26:00.044646 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.044620 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:26:00.044646 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.044632 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:26:00.044790 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.044724 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:26:00.050678 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.050651 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c24fec81-fd18-43a5-884d-38c5bb7a71ab-metrics-client-ca\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") 
" pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.050678 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.050668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-accelerators-collector-config\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.052335 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.052307 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:26:00.052544 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.052515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.052801 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.052783 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c24fec81-fd18-43a5-884d-38c5bb7a71ab-node-exporter-tls\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.063166 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.063120 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:26:00.073682 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.073646 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmcss\" (UniqueName: 
\"kubernetes.io/projected/c24fec81-fd18-43a5-884d-38c5bb7a71ab-kube-api-access-cmcss\") pod \"node-exporter-8dn6z\" (UID: \"c24fec81-fd18-43a5-884d-38c5bb7a71ab\") " pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.184274 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.184199 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-nfpmc\"" Apr 22 19:26:00.192691 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.192661 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8dn6z" Apr 22 19:26:00.202409 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:26:00.202376 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24fec81_fd18_43a5_884d_38c5bb7a71ab.slice/crio-c8b95d7ccd54ce178d7d670096162d08a4850ad199e22a348c86c1ff89407b89 WatchSource:0}: Error finding container c8b95d7ccd54ce178d7d670096162d08a4850ad199e22a348c86c1ff89407b89: Status 404 returned error can't find the container with id c8b95d7ccd54ce178d7d670096162d08a4850ad199e22a348c86c1ff89407b89 Apr 22 19:26:00.670938 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.670881 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8dn6z" event={"ID":"c24fec81-fd18-43a5-884d-38c5bb7a71ab","Type":"ContainerStarted","Data":"c8b95d7ccd54ce178d7d670096162d08a4850ad199e22a348c86c1ff89407b89"} Apr 22 19:26:00.686534 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:00.686491 2576 patch_prober.go:28] interesting pod/image-registry-5d578bf96b-fdww5 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 19:26:00.686671 ip-10-0-138-15 
kubenswrapper[2576]: I0422 19:26:00.686560 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" podUID="130af29b-4cd5-410b-b95b-ed57b79c76d2" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:26:01.675679 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:01.675625 2576 generic.go:358] "Generic (PLEG): container finished" podID="c24fec81-fd18-43a5-884d-38c5bb7a71ab" containerID="6026768f0d7f9cf65433126be7a9f31763010bdbf1d2fe8524b4ba4a216a94c4" exitCode=0 Apr 22 19:26:01.676094 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:01.675715 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8dn6z" event={"ID":"c24fec81-fd18-43a5-884d-38c5bb7a71ab","Type":"ContainerDied","Data":"6026768f0d7f9cf65433126be7a9f31763010bdbf1d2fe8524b4ba4a216a94c4"} Apr 22 19:26:02.599546 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.599514 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:26:02.681340 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.681303 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8dn6z" event={"ID":"c24fec81-fd18-43a5-884d-38c5bb7a71ab","Type":"ContainerStarted","Data":"b399b6ad0cb1833fda11c877891ea98785aac030cf4112e28125e0d05e87ecea"} Apr 22 19:26:02.681340 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.681340 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8dn6z" event={"ID":"c24fec81-fd18-43a5-884d-38c5bb7a71ab","Type":"ContainerStarted","Data":"64470b6a24362bf41e7eed8c6869cafb2280463120d84d6688c9246d4676d3e8"} Apr 22 19:26:02.700691 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.700635 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8dn6z" 
podStartSLOduration=2.884493324 podStartE2EDuration="3.70061688s" podCreationTimestamp="2026-04-22 19:25:59 +0000 UTC" firstStartedPulling="2026-04-22 19:26:00.204499618 +0000 UTC m=+180.627164906" lastFinishedPulling="2026-04-22 19:26:01.020623174 +0000 UTC m=+181.443288462" observedRunningTime="2026-04-22 19:26:02.699057665 +0000 UTC m=+183.121722971" watchObservedRunningTime="2026-04-22 19:26:02.70061688 +0000 UTC m=+183.123282188" Apr 22 19:26:02.921476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.921442 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5c999d4f58-km8q5"] Apr 22 19:26:02.924932 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.924907 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:02.927743 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.927712 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-bg268td9qaatm\"" Apr 22 19:26:02.927743 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.927726 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 19:26:02.927952 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.927721 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-6kdml\"" Apr 22 19:26:02.928008 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.927969 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 19:26:02.928008 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.927998 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 19:26:02.928105 ip-10-0-138-15 
kubenswrapper[2576]: I0422 19:26:02.928042 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 19:26:02.928275 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.928251 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 19:26:02.936047 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.936029 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5c999d4f58-km8q5"] Apr 22 19:26:02.966438 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.966399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:02.966615 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.966448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:02.966615 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.966555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73cfb30c-6eb9-4213-879f-37b04ae3abe9-metrics-client-ca\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " 
pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:02.966615 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.966618 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-grpc-tls\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:02.966799 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.966664 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh5qw\" (UniqueName: \"kubernetes.io/projected/73cfb30c-6eb9-4213-879f-37b04ae3abe9-kube-api-access-zh5qw\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:02.966799 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.966702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:02.966799 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.966742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-tls\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:02.966799 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:02.966768 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.067152 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.067115 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73cfb30c-6eb9-4213-879f-37b04ae3abe9-metrics-client-ca\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.067314 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.067167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-grpc-tls\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.067314 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.067190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh5qw\" (UniqueName: \"kubernetes.io/projected/73cfb30c-6eb9-4213-879f-37b04ae3abe9-kube-api-access-zh5qw\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.067314 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.067214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.067477 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.067380 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-tls\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.067477 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.067412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.067477 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.067473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.067618 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.067504 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-kube-rbac-proxy\") pod 
\"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.068027 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.067975 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73cfb30c-6eb9-4213-879f-37b04ae3abe9-metrics-client-ca\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.070345 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.070316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.070481 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.070458 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.070656 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.070609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.070748 
ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.070656 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.070748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.070668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-thanos-querier-tls\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.071159 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.071137 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/73cfb30c-6eb9-4213-879f-37b04ae3abe9-secret-grpc-tls\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.075019 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.075002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh5qw\" (UniqueName: \"kubernetes.io/projected/73cfb30c-6eb9-4213-879f-37b04ae3abe9-kube-api-access-zh5qw\") pod \"thanos-querier-5c999d4f58-km8q5\" (UID: \"73cfb30c-6eb9-4213-879f-37b04ae3abe9\") " pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.235865 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.235782 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:03.384173 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.384128 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5c999d4f58-km8q5"] Apr 22 19:26:03.389035 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:26:03.389004 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73cfb30c_6eb9_4213_879f_37b04ae3abe9.slice/crio-6ea145171de287590509b53254df6c9637afb5dbfa12769a2d3b7f971a5e354b WatchSource:0}: Error finding container 6ea145171de287590509b53254df6c9637afb5dbfa12769a2d3b7f971a5e354b: Status 404 returned error can't find the container with id 6ea145171de287590509b53254df6c9637afb5dbfa12769a2d3b7f971a5e354b Apr 22 19:26:03.687543 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:03.687503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" event={"ID":"73cfb30c-6eb9-4213-879f-37b04ae3abe9","Type":"ContainerStarted","Data":"6ea145171de287590509b53254df6c9637afb5dbfa12769a2d3b7f971a5e354b"} Apr 22 19:26:04.386670 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.386632 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6477979999-8mjqj"] Apr 22 19:26:04.390173 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.390149 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.393734 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.393707 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-vbmfz\"" Apr 22 19:26:04.393877 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.393778 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 19:26:04.395346 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.395320 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 19:26:04.395446 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.395380 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 19:26:04.395520 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.395478 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-7tfgcf7phbhia\"" Apr 22 19:26:04.395722 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.395704 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 19:26:04.419313 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.419277 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6477979999-8mjqj"] Apr 22 19:26:04.482628 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.482556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-audit-log\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " 
pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.482628 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.482608 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-metrics-server-audit-profiles\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.482962 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.482674 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-client-ca-bundle\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.482962 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.482725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.482962 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.482775 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-secret-metrics-server-client-certs\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.482962 ip-10-0-138-15 kubenswrapper[2576]: 
I0422 19:26:04.482801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-secret-metrics-server-tls\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.482962 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.482835 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtbbp\" (UniqueName: \"kubernetes.io/projected/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-kube-api-access-jtbbp\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.584272 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.584231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-secret-metrics-server-client-certs\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.584466 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.584282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-secret-metrics-server-tls\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.584466 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.584321 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtbbp\" (UniqueName: 
\"kubernetes.io/projected/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-kube-api-access-jtbbp\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.584466 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.584396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-audit-log\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.584466 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.584421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-metrics-server-audit-profiles\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.584730 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.584479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-client-ca-bundle\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.584730 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.584512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 
22 19:26:04.585877 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.584997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-audit-log\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.585877 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.585329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.585877 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.585814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-metrics-server-audit-profiles\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.588083 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.588033 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-secret-metrics-server-tls\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.588892 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.588848 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-secret-metrics-server-client-certs\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.588892 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.588874 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-client-ca-bundle\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.594404 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.594379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtbbp\" (UniqueName: \"kubernetes.io/projected/f7fe6d59-8c0a-41d8-b794-8912d6ef43e9-kube-api-access-jtbbp\") pod \"metrics-server-6477979999-8mjqj\" (UID: \"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9\") " pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.701782 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.701620 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:04.845345 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:04.845296 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6477979999-8mjqj"] Apr 22 19:26:05.331232 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:26:05.331194 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7fe6d59_8c0a_41d8_b794_8912d6ef43e9.slice/crio-f1637601a48859e60bf1a1426aebb65711d0267173787098786e61ce32a453e7 WatchSource:0}: Error finding container f1637601a48859e60bf1a1426aebb65711d0267173787098786e61ce32a453e7: Status 404 returned error can't find the container with id f1637601a48859e60bf1a1426aebb65711d0267173787098786e61ce32a453e7 Apr 22 19:26:05.698140 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:05.698098 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" event={"ID":"73cfb30c-6eb9-4213-879f-37b04ae3abe9","Type":"ContainerStarted","Data":"b8c674ab35ead53806e491baa197a00048355758123e185c2f3c04a7f25e16f7"} Apr 22 19:26:05.698140 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:05.698144 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" event={"ID":"73cfb30c-6eb9-4213-879f-37b04ae3abe9","Type":"ContainerStarted","Data":"6d775bcc9d28279b843d74016c47937fb92e42d0b567b1caa1e407d45d911a38"} Apr 22 19:26:05.698373 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:05.698157 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" event={"ID":"73cfb30c-6eb9-4213-879f-37b04ae3abe9","Type":"ContainerStarted","Data":"2ba7b130a88b210658115663531c140376a11bd4096c0e103c91b71d478065f0"} Apr 22 19:26:05.699351 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:05.699323 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/metrics-server-6477979999-8mjqj" event={"ID":"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9","Type":"ContainerStarted","Data":"f1637601a48859e60bf1a1426aebb65711d0267173787098786e61ce32a453e7"} Apr 22 19:26:06.062269 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.062238 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:26:06.067127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.067096 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.070392 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.070362 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 19:26:06.074435 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.074388 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 19:26:06.074564 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.074516 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 19:26:06.074753 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.074732 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 19:26:06.074842 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.074731 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 19:26:06.075114 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.074982 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 19:26:06.075529 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.075511 
2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mmwnr\"" Apr 22 19:26:06.076897 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.075577 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 19:26:06.076897 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.075632 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 19:26:06.076897 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.075709 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 19:26:06.076897 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.075862 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 19:26:06.076897 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.075967 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bs9991tkbktio\"" Apr 22 19:26:06.078943 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.078560 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 19:26:06.081012 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.080987 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 19:26:06.085713 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.085693 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:26:06.098994 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.098957 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099171 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099171 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75d84c90-caec-4329-a0c7-f6b6536b1c07-config-out\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099171 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099171 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099171 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099426 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099182 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099426 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmprn\" (UniqueName: \"kubernetes.io/projected/75d84c90-caec-4329-a0c7-f6b6536b1c07-kube-api-access-tmprn\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099426 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099426 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099343 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099426 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099395 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099657 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099657 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099657 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
22 19:26:06.099657 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099657 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099602 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-config\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099657 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099625 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75d84c90-caec-4329-a0c7-f6b6536b1c07-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.099958 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.099704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-web-config\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.200844 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.200808 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.200844 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.200852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201106 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.200887 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201106 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.200914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-config\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201106 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.200948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75d84c90-caec-4329-a0c7-f6b6536b1c07-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201106 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.200979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-web-config\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201106 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.201000 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201106 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.201018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201106 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.201037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75d84c90-caec-4329-a0c7-f6b6536b1c07-config-out\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201106 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.201051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201106 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.201067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201106 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.201084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201106 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.201104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201634 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.201122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmprn\" (UniqueName: \"kubernetes.io/projected/75d84c90-caec-4329-a0c7-f6b6536b1c07-kube-api-access-tmprn\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201634 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.201144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201634 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.201166 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201634 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.201204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.201634 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.201235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.206225 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.202713 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.206225 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.204944 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75d84c90-caec-4329-a0c7-f6b6536b1c07-config-out\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.206225 ip-10-0-138-15 
kubenswrapper[2576]: I0422 19:26:06.205490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.206225 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.205777 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.206225 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.206103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.206225 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.206214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.206618 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.206316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-config\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.206941 
ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.206897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.207712 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.207555 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.208143 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.208097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.208751 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.208700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-web-config\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.208983 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.208936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.209297 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.209272 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.209841 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.209785 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.210539 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.210517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75d84c90-caec-4329-a0c7-f6b6536b1c07-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.211082 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.211063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.211783 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.211764 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.217993 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.217951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmprn\" (UniqueName: \"kubernetes.io/projected/75d84c90-caec-4329-a0c7-f6b6536b1c07-kube-api-access-tmprn\") pod \"prometheus-k8s-0\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:06.382553 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:06.382515 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:26:13.802639 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:13.802586 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:26:13.803622 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:26:13.803593 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75d84c90_caec_4329_a0c7_f6b6536b1c07.slice/crio-92c6f69653219b239d183cf8e3a7939f983ad61d58bb98c96e0e8101ac87852d WatchSource:0}: Error finding container 92c6f69653219b239d183cf8e3a7939f983ad61d58bb98c96e0e8101ac87852d: Status 404 returned error can't find the container with id 92c6f69653219b239d183cf8e3a7939f983ad61d58bb98c96e0e8101ac87852d Apr 22 19:26:14.300006 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:14.299963 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d578bf96b-fdww5"] Apr 22 19:26:14.731463 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:14.731422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerStarted","Data":"92c6f69653219b239d183cf8e3a7939f983ad61d58bb98c96e0e8101ac87852d"} Apr 22 19:26:14.733119 ip-10-0-138-15 
kubenswrapper[2576]: I0422 19:26:14.733084 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-swq4t" event={"ID":"fd57b9ed-3208-41fd-aab6-8c6d3078a852","Type":"ContainerStarted","Data":"512658d2fa90c99eff097a50445bfd87ca428bbe58c453fdc6815e6c55e5e043"} Apr 22 19:26:14.733579 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:14.733536 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-swq4t" Apr 22 19:26:14.735154 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:14.735112 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6477979999-8mjqj" event={"ID":"f7fe6d59-8c0a-41d8-b794-8912d6ef43e9","Type":"ContainerStarted","Data":"7941cee51a376a43bc50833f3aa43669edfc911e8097b9d2c5f18353bb27a8bd"} Apr 22 19:26:14.738406 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:14.738380 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" event={"ID":"73cfb30c-6eb9-4213-879f-37b04ae3abe9","Type":"ContainerStarted","Data":"79115bf28f5efcad3923d6d9412702c43036b348964935df149fdd2a70b7384c"} Apr 22 19:26:14.738406 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:14.738411 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" event={"ID":"73cfb30c-6eb9-4213-879f-37b04ae3abe9","Type":"ContainerStarted","Data":"92f5afb364d1a896a379544087844a64848015e27f36ef2c9fe0278fa5f75c8a"} Apr 22 19:26:14.738623 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:14.738425 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" event={"ID":"73cfb30c-6eb9-4213-879f-37b04ae3abe9","Type":"ContainerStarted","Data":"6168dc233ebbca1fec192811858ad07e3899bd3c79dedc2774db5884935c3b35"} Apr 22 19:26:14.738795 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:14.738762 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:14.745047 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:14.745023 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" Apr 22 19:26:14.750162 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:14.750131 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-swq4t" Apr 22 19:26:14.755133 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:14.755087 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-swq4t" podStartSLOduration=2.136357351 podStartE2EDuration="22.755074277s" podCreationTimestamp="2026-04-22 19:25:52 +0000 UTC" firstStartedPulling="2026-04-22 19:25:53.081788112 +0000 UTC m=+173.504453401" lastFinishedPulling="2026-04-22 19:26:13.700505042 +0000 UTC m=+194.123170327" observedRunningTime="2026-04-22 19:26:14.754190203 +0000 UTC m=+195.176855511" watchObservedRunningTime="2026-04-22 19:26:14.755074277 +0000 UTC m=+195.177739585" Apr 22 19:26:14.801091 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:14.801024 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5c999d4f58-km8q5" podStartSLOduration=2.536012633 podStartE2EDuration="12.801005509s" podCreationTimestamp="2026-04-22 19:26:02 +0000 UTC" firstStartedPulling="2026-04-22 19:26:03.392025763 +0000 UTC m=+183.814691055" lastFinishedPulling="2026-04-22 19:26:13.657018637 +0000 UTC m=+194.079683931" observedRunningTime="2026-04-22 19:26:14.80043517 +0000 UTC m=+195.223100476" watchObservedRunningTime="2026-04-22 19:26:14.801005509 +0000 UTC m=+195.223670818" Apr 22 19:26:14.826665 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:14.826590 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/metrics-server-6477979999-8mjqj" podStartSLOduration=2.500831713 podStartE2EDuration="10.826569703s" podCreationTimestamp="2026-04-22 19:26:04 +0000 UTC" firstStartedPulling="2026-04-22 19:26:05.333745104 +0000 UTC m=+185.756410389" lastFinishedPulling="2026-04-22 19:26:13.65948308 +0000 UTC m=+194.082148379" observedRunningTime="2026-04-22 19:26:14.825235807 +0000 UTC m=+195.247901114" watchObservedRunningTime="2026-04-22 19:26:14.826569703 +0000 UTC m=+195.249235011" Apr 22 19:26:15.743655 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:15.743615 2576 generic.go:358] "Generic (PLEG): container finished" podID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerID="6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8" exitCode=0 Apr 22 19:26:15.743834 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:15.743721 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerDied","Data":"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8"} Apr 22 19:26:19.760656 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:19.760619 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerStarted","Data":"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198"} Apr 22 19:26:19.761107 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:19.760665 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerStarted","Data":"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9"} Apr 22 19:26:19.761107 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:19.760677 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerStarted","Data":"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96"} Apr 22 19:26:19.761107 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:19.760689 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerStarted","Data":"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5"} Apr 22 19:26:20.767814 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:20.767774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerStarted","Data":"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01"} Apr 22 19:26:20.767814 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:20.767816 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerStarted","Data":"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652"} Apr 22 19:26:20.800631 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:20.800570 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=9.479842855 podStartE2EDuration="14.800556114s" podCreationTimestamp="2026-04-22 19:26:06 +0000 UTC" firstStartedPulling="2026-04-22 19:26:13.806055868 +0000 UTC m=+194.228721170" lastFinishedPulling="2026-04-22 19:26:19.12676913 +0000 UTC m=+199.549434429" observedRunningTime="2026-04-22 19:26:20.799180094 +0000 UTC m=+201.221845437" watchObservedRunningTime="2026-04-22 19:26:20.800556114 +0000 UTC m=+201.223221482" Apr 22 19:26:21.383073 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:21.383031 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
19:26:24.702037 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:24.701998 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:24.702407 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:24.702050 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:39.326621 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.326555 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" podUID="130af29b-4cd5-410b-b95b-ed57b79c76d2" containerName="registry" containerID="cri-o://2b763c0125ae71784a1a1c2fbecc1664ad5e8904aee0a122aa092263005aab08" gracePeriod=30 Apr 22 19:26:39.562441 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.562418 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:26:39.607578 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.607543 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/130af29b-4cd5-410b-b95b-ed57b79c76d2-ca-trust-extracted\") pod \"130af29b-4cd5-410b-b95b-ed57b79c76d2\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " Apr 22 19:26:39.607578 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.607580 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls\") pod \"130af29b-4cd5-410b-b95b-ed57b79c76d2\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " Apr 22 19:26:39.607773 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.607603 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49bcb\" (UniqueName: 
\"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-kube-api-access-49bcb\") pod \"130af29b-4cd5-410b-b95b-ed57b79c76d2\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " Apr 22 19:26:39.607834 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.607787 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-certificates\") pod \"130af29b-4cd5-410b-b95b-ed57b79c76d2\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " Apr 22 19:26:39.607884 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.607842 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/130af29b-4cd5-410b-b95b-ed57b79c76d2-image-registry-private-configuration\") pod \"130af29b-4cd5-410b-b95b-ed57b79c76d2\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " Apr 22 19:26:39.607951 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.607905 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/130af29b-4cd5-410b-b95b-ed57b79c76d2-installation-pull-secrets\") pod \"130af29b-4cd5-410b-b95b-ed57b79c76d2\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " Apr 22 19:26:39.608011 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.607975 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-bound-sa-token\") pod \"130af29b-4cd5-410b-b95b-ed57b79c76d2\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " Apr 22 19:26:39.608011 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.608002 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/130af29b-4cd5-410b-b95b-ed57b79c76d2-trusted-ca\") pod \"130af29b-4cd5-410b-b95b-ed57b79c76d2\" (UID: \"130af29b-4cd5-410b-b95b-ed57b79c76d2\") " Apr 22 19:26:39.608255 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.608215 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "130af29b-4cd5-410b-b95b-ed57b79c76d2" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:26:39.608683 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.608649 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/130af29b-4cd5-410b-b95b-ed57b79c76d2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "130af29b-4cd5-410b-b95b-ed57b79c76d2" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:26:39.610943 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.610893 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130af29b-4cd5-410b-b95b-ed57b79c76d2-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "130af29b-4cd5-410b-b95b-ed57b79c76d2" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:26:39.611053 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.610978 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-kube-api-access-49bcb" (OuterVolumeSpecName: "kube-api-access-49bcb") pod "130af29b-4cd5-410b-b95b-ed57b79c76d2" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2"). InnerVolumeSpecName "kube-api-access-49bcb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:26:39.611053 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.611028 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "130af29b-4cd5-410b-b95b-ed57b79c76d2" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:26:39.611053 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.611036 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "130af29b-4cd5-410b-b95b-ed57b79c76d2" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:26:39.611188 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.611098 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130af29b-4cd5-410b-b95b-ed57b79c76d2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "130af29b-4cd5-410b-b95b-ed57b79c76d2" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:26:39.615989 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.615959 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/130af29b-4cd5-410b-b95b-ed57b79c76d2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "130af29b-4cd5-410b-b95b-ed57b79c76d2" (UID: "130af29b-4cd5-410b-b95b-ed57b79c76d2"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:26:39.709143 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.709113 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/130af29b-4cd5-410b-b95b-ed57b79c76d2-ca-trust-extracted\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 22 19:26:39.709143 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.709137 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-tls\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 22 19:26:39.709143 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.709148 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-49bcb\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-kube-api-access-49bcb\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 22 19:26:39.709362 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.709157 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/130af29b-4cd5-410b-b95b-ed57b79c76d2-registry-certificates\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 22 19:26:39.709362 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.709167 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/130af29b-4cd5-410b-b95b-ed57b79c76d2-image-registry-private-configuration\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 22 19:26:39.709362 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.709177 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/130af29b-4cd5-410b-b95b-ed57b79c76d2-installation-pull-secrets\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 22 19:26:39.709362 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.709186 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/130af29b-4cd5-410b-b95b-ed57b79c76d2-bound-sa-token\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 22 19:26:39.709362 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.709195 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/130af29b-4cd5-410b-b95b-ed57b79c76d2-trusted-ca\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 22 19:26:39.830206 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.830170 2576 generic.go:358] "Generic (PLEG): container finished" podID="130af29b-4cd5-410b-b95b-ed57b79c76d2" containerID="2b763c0125ae71784a1a1c2fbecc1664ad5e8904aee0a122aa092263005aab08" exitCode=0 Apr 22 19:26:39.830371 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.830229 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" Apr 22 19:26:39.830371 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.830260 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" event={"ID":"130af29b-4cd5-410b-b95b-ed57b79c76d2","Type":"ContainerDied","Data":"2b763c0125ae71784a1a1c2fbecc1664ad5e8904aee0a122aa092263005aab08"} Apr 22 19:26:39.830371 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.830299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d578bf96b-fdww5" event={"ID":"130af29b-4cd5-410b-b95b-ed57b79c76d2","Type":"ContainerDied","Data":"ac1bf49c21c695f453d40195eaa3e9320a130cd9e2b5a5347424c0da9f08a4d3"} Apr 22 19:26:39.830371 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.830315 2576 scope.go:117] "RemoveContainer" containerID="2b763c0125ae71784a1a1c2fbecc1664ad5e8904aee0a122aa092263005aab08" Apr 22 19:26:39.838577 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.838562 2576 scope.go:117] "RemoveContainer" containerID="2b763c0125ae71784a1a1c2fbecc1664ad5e8904aee0a122aa092263005aab08" Apr 22 19:26:39.838799 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:26:39.838784 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b763c0125ae71784a1a1c2fbecc1664ad5e8904aee0a122aa092263005aab08\": container with ID starting with 2b763c0125ae71784a1a1c2fbecc1664ad5e8904aee0a122aa092263005aab08 not found: ID does not exist" containerID="2b763c0125ae71784a1a1c2fbecc1664ad5e8904aee0a122aa092263005aab08" Apr 22 19:26:39.838842 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.838805 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b763c0125ae71784a1a1c2fbecc1664ad5e8904aee0a122aa092263005aab08"} err="failed to get container status 
\"2b763c0125ae71784a1a1c2fbecc1664ad5e8904aee0a122aa092263005aab08\": rpc error: code = NotFound desc = could not find container \"2b763c0125ae71784a1a1c2fbecc1664ad5e8904aee0a122aa092263005aab08\": container with ID starting with 2b763c0125ae71784a1a1c2fbecc1664ad5e8904aee0a122aa092263005aab08 not found: ID does not exist" Apr 22 19:26:39.851023 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.850999 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d578bf96b-fdww5"] Apr 22 19:26:39.854862 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:39.854843 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5d578bf96b-fdww5"] Apr 22 19:26:40.155025 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:40.154991 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="130af29b-4cd5-410b-b95b-ed57b79c76d2" path="/var/lib/kubelet/pods/130af29b-4cd5-410b-b95b-ed57b79c76d2/volumes" Apr 22 19:26:44.706790 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:44.706762 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:44.710857 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:44.710829 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6477979999-8mjqj" Apr 22 19:26:48.857130 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:48.857096 2576 generic.go:358] "Generic (PLEG): container finished" podID="44c0d1dd-5d1c-443a-a71d-40d163d60028" containerID="9952b503e626700b8648b3e99f86c95cfe7c93ee36da65136ed136d2ade0ca28" exitCode=0 Apr 22 19:26:48.857540 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:48.857175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" 
event={"ID":"44c0d1dd-5d1c-443a-a71d-40d163d60028","Type":"ContainerDied","Data":"9952b503e626700b8648b3e99f86c95cfe7c93ee36da65136ed136d2ade0ca28"} Apr 22 19:26:48.857591 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:48.857577 2576 scope.go:117] "RemoveContainer" containerID="9952b503e626700b8648b3e99f86c95cfe7c93ee36da65136ed136d2ade0ca28" Apr 22 19:26:49.863150 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:26:49.863105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9776m" event={"ID":"44c0d1dd-5d1c-443a-a71d-40d163d60028","Type":"ContainerStarted","Data":"34138c262fb00f2f4f89227510780dacfeec14a0affea0a4fd7a22919503d0a8"} Apr 22 19:27:06.383592 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:06.383554 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:06.402613 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:06.402584 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:06.932335 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:06.932304 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:11.997935 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:11.997874 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs\") pod \"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:27:12.000421 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:12.000385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/957d9773-bf39-486e-a32e-eba60e7b49e9-metrics-certs\") pod \"network-metrics-daemon-m8fmk\" (UID: \"957d9773-bf39-486e-a32e-eba60e7b49e9\") " pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:27:12.056249 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:12.056218 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kmhxv\"" Apr 22 19:27:12.064550 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:12.064528 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8fmk" Apr 22 19:27:12.183398 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:12.183362 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-m8fmk"] Apr 22 19:27:12.186523 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:27:12.186494 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod957d9773_bf39_486e_a32e_eba60e7b49e9.slice/crio-e4506ff674c68464ba91fd852aa83ad826d9740b7122ae2cdd4b40eaea89b292 WatchSource:0}: Error finding container e4506ff674c68464ba91fd852aa83ad826d9740b7122ae2cdd4b40eaea89b292: Status 404 returned error can't find the container with id e4506ff674c68464ba91fd852aa83ad826d9740b7122ae2cdd4b40eaea89b292 Apr 22 19:27:12.936990 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:12.936949 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m8fmk" event={"ID":"957d9773-bf39-486e-a32e-eba60e7b49e9","Type":"ContainerStarted","Data":"e4506ff674c68464ba91fd852aa83ad826d9740b7122ae2cdd4b40eaea89b292"} Apr 22 19:27:13.941149 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:13.941115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m8fmk" 
event={"ID":"957d9773-bf39-486e-a32e-eba60e7b49e9","Type":"ContainerStarted","Data":"6b012e16254bf34d4d7e9a296101d90fa1235420a7f27c7cb950e971d46c3abd"} Apr 22 19:27:13.941149 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:13.941151 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m8fmk" event={"ID":"957d9773-bf39-486e-a32e-eba60e7b49e9","Type":"ContainerStarted","Data":"ff9fb5a7601764e5426029d7e7bf0850bf2bb628272d0c7b3e249c396e500fc4"} Apr 22 19:27:13.965034 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:13.964991 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-m8fmk" podStartSLOduration=252.856929296 podStartE2EDuration="4m13.964977394s" podCreationTimestamp="2026-04-22 19:23:00 +0000 UTC" firstStartedPulling="2026-04-22 19:27:12.18880628 +0000 UTC m=+252.611471566" lastFinishedPulling="2026-04-22 19:27:13.296854379 +0000 UTC m=+253.719519664" observedRunningTime="2026-04-22 19:27:13.963062132 +0000 UTC m=+254.385727438" watchObservedRunningTime="2026-04-22 19:27:13.964977394 +0000 UTC m=+254.387642701" Apr 22 19:27:24.468369 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.468285 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:27:24.468811 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.468753 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="prometheus" containerID="cri-o://5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5" gracePeriod=600 Apr 22 19:27:24.468903 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.468800 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="config-reloader" 
containerID="cri-o://9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96" gracePeriod=600 Apr 22 19:27:24.468903 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.468828 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="kube-rbac-proxy-web" containerID="cri-o://0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198" gracePeriod=600 Apr 22 19:27:24.469042 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.468810 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="kube-rbac-proxy-thanos" containerID="cri-o://01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01" gracePeriod=600 Apr 22 19:27:24.469042 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.469010 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="thanos-sidecar" containerID="cri-o://28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9" gracePeriod=600 Apr 22 19:27:24.469172 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.469032 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="kube-rbac-proxy" containerID="cri-o://f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652" gracePeriod=600 Apr 22 19:27:24.696780 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.696756 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:27:24.803695 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.803617 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-serving-certs-ca-bundle\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.803695 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.803655 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-kube-rbac-proxy\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.803695 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.803678 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-grpc-tls\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.803958 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.803726 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmprn\" (UniqueName: \"kubernetes.io/projected/75d84c90-caec-4329-a0c7-f6b6536b1c07-kube-api-access-tmprn\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.803958 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.803900 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.804077 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.803970 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75d84c90-caec-4329-a0c7-f6b6536b1c07-tls-assets\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.804077 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804003 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.804077 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804035 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-k8s-db\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.804077 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804064 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-thanos-prometheus-http-client-file\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.804267 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804123 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-kubelet-serving-ca-bundle\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.804267 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804151 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-web-config\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.804267 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804179 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-config\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.804267 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804207 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-k8s-rulefiles-0\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.804267 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804251 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:27:24.804515 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804263 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75d84c90-caec-4329-a0c7-f6b6536b1c07-config-out\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.804515 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804329 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-tls\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.804515 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804468 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-metrics-client-certs\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.804515 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804506 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-trusted-ca-bundle\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.804695 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804532 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-metrics-client-ca\") pod \"75d84c90-caec-4329-a0c7-f6b6536b1c07\" (UID: \"75d84c90-caec-4329-a0c7-f6b6536b1c07\") "
Apr 22 19:27:24.804695 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804546 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:27:24.805040 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804825 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.805040 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.804849 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.805218 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.805196 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:27:24.806025 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.805713 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:27:24.806832 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.806629 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:27:24.807245 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.807054 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:27:24.807798 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.807774 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d84c90-caec-4329-a0c7-f6b6536b1c07-config-out" (OuterVolumeSpecName: "config-out") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:27:24.807798 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.807785 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:27:24.807976 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.807795 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d84c90-caec-4329-a0c7-f6b6536b1c07-kube-api-access-tmprn" (OuterVolumeSpecName: "kube-api-access-tmprn") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "kube-api-access-tmprn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:27:24.807976 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.807779 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:27:24.808117 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.808081 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:27:24.808117 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.808094 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:27:24.809276 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.809245 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:27:24.809404 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.809282 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:27:24.809474 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.809459 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d84c90-caec-4329-a0c7-f6b6536b1c07-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:27:24.809598 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.809584 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-config" (OuterVolumeSpecName: "config") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:27:24.809687 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.809671 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:27:24.822401 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.822378 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-web-config" (OuterVolumeSpecName: "web-config") pod "75d84c90-caec-4329-a0c7-f6b6536b1c07" (UID: "75d84c90-caec-4329-a0c7-f6b6536b1c07"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:27:24.906045 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906006 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75d84c90-caec-4329-a0c7-f6b6536b1c07-config-out\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906045 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906039 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-tls\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906045 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906052 2576 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-metrics-client-certs\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906308 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906067 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-trusted-ca-bundle\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906308 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906081 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-configmap-metrics-client-ca\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906308 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906093 2576 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-kube-rbac-proxy\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906308 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906105 2576 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-grpc-tls\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906308 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906117 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tmprn\" (UniqueName: \"kubernetes.io/projected/75d84c90-caec-4329-a0c7-f6b6536b1c07-kube-api-access-tmprn\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906308 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906129 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906308 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906142 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75d84c90-caec-4329-a0c7-f6b6536b1c07-tls-assets\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906308 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906153 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906308 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906164 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-k8s-db\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906308 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906179 2576 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-thanos-prometheus-http-client-file\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906308 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906193 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-web-config\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906308 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906204 2576 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/75d84c90-caec-4329-a0c7-f6b6536b1c07-config\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.906308 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.906216 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75d84c90-caec-4329-a0c7-f6b6536b1c07-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:27:24.982647 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982608 2576 generic.go:358] "Generic (PLEG): container finished" podID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerID="01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01" exitCode=0
Apr 22 19:27:24.982647 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982636 2576 generic.go:358] "Generic (PLEG): container finished" podID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerID="f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652" exitCode=0
Apr 22 19:27:24.982647 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982645 2576 generic.go:358] "Generic (PLEG): container finished" podID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerID="0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198" exitCode=0
Apr 22 19:27:24.982647 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982653 2576 generic.go:358] "Generic (PLEG): container finished" podID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerID="28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9" exitCode=0
Apr 22 19:27:24.982956 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982663 2576 generic.go:358] "Generic (PLEG): container finished" podID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerID="9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96" exitCode=0
Apr 22 19:27:24.982956 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982670 2576 generic.go:358] "Generic (PLEG): container finished" podID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerID="5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5" exitCode=0
Apr 22 19:27:24.982956 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982676 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerDied","Data":"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01"}
Apr 22 19:27:24.982956 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982709 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:27:24.982956 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982724 2576 scope.go:117] "RemoveContainer" containerID="01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01"
Apr 22 19:27:24.982956 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982714 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerDied","Data":"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652"}
Apr 22 19:27:24.982956 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982904 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerDied","Data":"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198"}
Apr 22 19:27:24.982956 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982954 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerDied","Data":"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9"}
Apr 22 19:27:24.983240 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982972 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerDied","Data":"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96"}
Apr 22 19:27:24.983240 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982983 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerDied","Data":"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5"}
Apr 22 19:27:24.983240 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.982997 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75d84c90-caec-4329-a0c7-f6b6536b1c07","Type":"ContainerDied","Data":"92c6f69653219b239d183cf8e3a7939f983ad61d58bb98c96e0e8101ac87852d"}
Apr 22 19:27:24.990476 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.990457 2576 scope.go:117] "RemoveContainer" containerID="f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652"
Apr 22 19:27:24.997424 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:24.997407 2576 scope.go:117] "RemoveContainer" containerID="0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198"
Apr 22 19:27:25.003752 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.003734 2576 scope.go:117] "RemoveContainer" containerID="28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9"
Apr 22 19:27:25.009169 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.009147 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 19:27:25.010343 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.010326 2576 scope.go:117] "RemoveContainer" containerID="9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96"
Apr 22 19:27:25.015147 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.015126 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 19:27:25.017126 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.017096 2576 scope.go:117] "RemoveContainer" containerID="5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5"
Apr 22 19:27:25.023785 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.023769 2576 scope.go:117] "RemoveContainer" containerID="6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8"
Apr 22 19:27:25.030136 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.030120 2576 scope.go:117] "RemoveContainer" containerID="01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01"
Apr 22 19:27:25.030385 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:27:25.030370 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01\": container with ID starting with 01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01 not found: ID does not exist" containerID="01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01"
Apr 22 19:27:25.030432 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.030393 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01"} err="failed to get container status \"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01\": rpc error: code = NotFound desc = could not find container \"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01\": container with ID starting with 01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01 not found: ID does not exist"
Apr 22 19:27:25.030432 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.030412 2576 scope.go:117] "RemoveContainer" containerID="f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652"
Apr 22 19:27:25.030648 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:27:25.030633 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652\": container with ID starting with f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652 not found: ID does not exist" containerID="f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652"
Apr 22 19:27:25.030702 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.030650 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652"} err="failed to get container status \"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652\": rpc error: code = NotFound desc = could not find container \"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652\": container with ID starting with f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652 not found: ID does not exist"
Apr 22 19:27:25.030702 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.030665 2576 scope.go:117] "RemoveContainer" containerID="0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198"
Apr 22 19:27:25.030881 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:27:25.030866 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198\": container with ID starting with 0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198 not found: ID does not exist" containerID="0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198"
Apr 22 19:27:25.030937 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.030885 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198"} err="failed to get container status \"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198\": rpc error: code = NotFound desc = could not find container \"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198\": container with ID starting with 0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198 not found: ID does not exist"
Apr 22 19:27:25.030937 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.030897 2576 scope.go:117] "RemoveContainer" containerID="28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9"
Apr 22 19:27:25.031141 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:27:25.031122 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9\": container with ID starting with 28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9 not found: ID does not exist" containerID="28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9"
Apr 22 19:27:25.031186 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.031146 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9"} err="failed to get container status \"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9\": rpc error: code = NotFound desc = could not find container \"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9\": container with ID starting with 28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9 not found: ID does not exist"
Apr 22 19:27:25.031186 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.031163 2576 scope.go:117] "RemoveContainer" containerID="9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96"
Apr 22 19:27:25.031379 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:27:25.031362 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96\": container with ID starting with 9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96 not found: ID does not exist" containerID="9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96"
Apr 22 19:27:25.031422 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.031380 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96"} err="failed to get container status \"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96\": rpc error: code = NotFound desc = could not find container \"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96\": container with ID starting with 9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96 not found: ID does not exist"
Apr 22 19:27:25.031422 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.031406 2576 scope.go:117] "RemoveContainer" containerID="5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5"
Apr 22 19:27:25.031598 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:27:25.031580 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5\": container with ID starting with 5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5 not found: ID does not exist" containerID="5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5"
Apr 22 19:27:25.031640 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.031603 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5"} err="failed to get container status \"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5\": rpc error: code = NotFound desc = could not find container \"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5\": container with ID starting with 5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5 not found: ID does not exist"
Apr 22 19:27:25.031640 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.031618 2576 scope.go:117] "RemoveContainer" containerID="6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8"
Apr 22 19:27:25.031826 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:27:25.031811 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8\": container with ID starting with 6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8 not found: ID does not exist" containerID="6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8"
Apr 22 19:27:25.031865 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.031828 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8"} err="failed to get container status \"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8\": rpc error: code = NotFound desc = could not find container \"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8\": container with ID starting with 6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8 not found: ID does not exist"
Apr 22 19:27:25.031865 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.031840 2576 scope.go:117] "RemoveContainer" containerID="01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01"
Apr 22 19:27:25.032033 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.032016 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01"} err="failed to get container status \"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01\": rpc error: code = NotFound desc = could not find container \"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01\": container with ID starting with 01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01 not found: ID does not exist"
Apr 22 19:27:25.032089 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.032034 2576 scope.go:117] "RemoveContainer" containerID="f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652"
Apr 22 19:27:25.032241 ip-10-0-138-15 kubenswrapper[2576]: I0422
19:27:25.032226 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652"} err="failed to get container status \"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652\": rpc error: code = NotFound desc = could not find container \"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652\": container with ID starting with f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652 not found: ID does not exist" Apr 22 19:27:25.032286 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.032242 2576 scope.go:117] "RemoveContainer" containerID="0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198" Apr 22 19:27:25.032468 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.032451 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198"} err="failed to get container status \"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198\": rpc error: code = NotFound desc = could not find container \"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198\": container with ID starting with 0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198 not found: ID does not exist" Apr 22 19:27:25.032468 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.032467 2576 scope.go:117] "RemoveContainer" containerID="28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9" Apr 22 19:27:25.032660 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.032638 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9"} err="failed to get container status \"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9\": rpc error: code = NotFound desc = could not find container 
\"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9\": container with ID starting with 28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9 not found: ID does not exist" Apr 22 19:27:25.032727 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.032662 2576 scope.go:117] "RemoveContainer" containerID="9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96" Apr 22 19:27:25.032871 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.032853 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96"} err="failed to get container status \"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96\": rpc error: code = NotFound desc = could not find container \"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96\": container with ID starting with 9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96 not found: ID does not exist" Apr 22 19:27:25.033021 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.032872 2576 scope.go:117] "RemoveContainer" containerID="5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5" Apr 22 19:27:25.033097 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.033081 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5"} err="failed to get container status \"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5\": rpc error: code = NotFound desc = could not find container \"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5\": container with ID starting with 5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5 not found: ID does not exist" Apr 22 19:27:25.033144 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.033098 2576 scope.go:117] "RemoveContainer" 
containerID="6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8" Apr 22 19:27:25.033288 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.033265 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8"} err="failed to get container status \"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8\": rpc error: code = NotFound desc = could not find container \"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8\": container with ID starting with 6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8 not found: ID does not exist" Apr 22 19:27:25.033330 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.033289 2576 scope.go:117] "RemoveContainer" containerID="01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01" Apr 22 19:27:25.033473 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.033455 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01"} err="failed to get container status \"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01\": rpc error: code = NotFound desc = could not find container \"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01\": container with ID starting with 01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01 not found: ID does not exist" Apr 22 19:27:25.033473 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.033471 2576 scope.go:117] "RemoveContainer" containerID="f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652" Apr 22 19:27:25.033642 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.033621 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652"} err="failed to get container status 
\"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652\": rpc error: code = NotFound desc = could not find container \"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652\": container with ID starting with f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652 not found: ID does not exist" Apr 22 19:27:25.033683 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.033643 2576 scope.go:117] "RemoveContainer" containerID="0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198" Apr 22 19:27:25.033854 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.033837 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198"} err="failed to get container status \"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198\": rpc error: code = NotFound desc = could not find container \"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198\": container with ID starting with 0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198 not found: ID does not exist" Apr 22 19:27:25.033891 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.033854 2576 scope.go:117] "RemoveContainer" containerID="28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9" Apr 22 19:27:25.034307 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.034228 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9"} err="failed to get container status \"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9\": rpc error: code = NotFound desc = could not find container \"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9\": container with ID starting with 28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9 not found: ID does not exist" Apr 22 19:27:25.034307 ip-10-0-138-15 
kubenswrapper[2576]: I0422 19:27:25.034255 2576 scope.go:117] "RemoveContainer" containerID="9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96" Apr 22 19:27:25.034557 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.034486 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96"} err="failed to get container status \"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96\": rpc error: code = NotFound desc = could not find container \"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96\": container with ID starting with 9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96 not found: ID does not exist" Apr 22 19:27:25.034557 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.034506 2576 scope.go:117] "RemoveContainer" containerID="5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5" Apr 22 19:27:25.034811 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.034736 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5"} err="failed to get container status \"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5\": rpc error: code = NotFound desc = could not find container \"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5\": container with ID starting with 5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5 not found: ID does not exist" Apr 22 19:27:25.034811 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.034756 2576 scope.go:117] "RemoveContainer" containerID="6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8" Apr 22 19:27:25.035129 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.035045 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8"} err="failed to get container status \"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8\": rpc error: code = NotFound desc = could not find container \"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8\": container with ID starting with 6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8 not found: ID does not exist" Apr 22 19:27:25.035129 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.035070 2576 scope.go:117] "RemoveContainer" containerID="01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01" Apr 22 19:27:25.035492 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.035417 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01"} err="failed to get container status \"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01\": rpc error: code = NotFound desc = could not find container \"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01\": container with ID starting with 01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01 not found: ID does not exist" Apr 22 19:27:25.035492 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.035438 2576 scope.go:117] "RemoveContainer" containerID="f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652" Apr 22 19:27:25.035739 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.035665 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652"} err="failed to get container status \"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652\": rpc error: code = NotFound desc = could not find container \"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652\": container with ID starting with 
f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652 not found: ID does not exist" Apr 22 19:27:25.035739 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.035685 2576 scope.go:117] "RemoveContainer" containerID="0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198" Apr 22 19:27:25.036042 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.035964 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198"} err="failed to get container status \"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198\": rpc error: code = NotFound desc = could not find container \"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198\": container with ID starting with 0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198 not found: ID does not exist" Apr 22 19:27:25.036042 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.035986 2576 scope.go:117] "RemoveContainer" containerID="28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9" Apr 22 19:27:25.036311 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.036233 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9"} err="failed to get container status \"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9\": rpc error: code = NotFound desc = could not find container \"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9\": container with ID starting with 28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9 not found: ID does not exist" Apr 22 19:27:25.036311 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.036253 2576 scope.go:117] "RemoveContainer" containerID="9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96" Apr 22 19:27:25.036635 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.036588 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96"} err="failed to get container status \"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96\": rpc error: code = NotFound desc = could not find container \"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96\": container with ID starting with 9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96 not found: ID does not exist" Apr 22 19:27:25.036635 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.036609 2576 scope.go:117] "RemoveContainer" containerID="5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5" Apr 22 19:27:25.036938 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.036900 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5"} err="failed to get container status \"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5\": rpc error: code = NotFound desc = could not find container \"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5\": container with ID starting with 5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5 not found: ID does not exist" Apr 22 19:27:25.037007 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.036940 2576 scope.go:117] "RemoveContainer" containerID="6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8" Apr 22 19:27:25.037226 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.037203 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8"} err="failed to get container status \"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8\": rpc error: code = NotFound desc = could not find container 
\"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8\": container with ID starting with 6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8 not found: ID does not exist" Apr 22 19:27:25.037290 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.037228 2576 scope.go:117] "RemoveContainer" containerID="01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01" Apr 22 19:27:25.037813 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.037774 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01"} err="failed to get container status \"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01\": rpc error: code = NotFound desc = could not find container \"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01\": container with ID starting with 01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01 not found: ID does not exist" Apr 22 19:27:25.037813 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.037794 2576 scope.go:117] "RemoveContainer" containerID="f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652" Apr 22 19:27:25.038084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.038033 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652"} err="failed to get container status \"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652\": rpc error: code = NotFound desc = could not find container \"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652\": container with ID starting with f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652 not found: ID does not exist" Apr 22 19:27:25.038084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.038050 2576 scope.go:117] "RemoveContainer" 
containerID="0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198" Apr 22 19:27:25.038276 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.038253 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198"} err="failed to get container status \"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198\": rpc error: code = NotFound desc = could not find container \"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198\": container with ID starting with 0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198 not found: ID does not exist" Apr 22 19:27:25.038351 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.038278 2576 scope.go:117] "RemoveContainer" containerID="28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9" Apr 22 19:27:25.038550 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.038506 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9"} err="failed to get container status \"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9\": rpc error: code = NotFound desc = could not find container \"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9\": container with ID starting with 28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9 not found: ID does not exist" Apr 22 19:27:25.038550 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.038532 2576 scope.go:117] "RemoveContainer" containerID="9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96" Apr 22 19:27:25.038803 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.038750 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96"} err="failed to get container status 
\"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96\": rpc error: code = NotFound desc = could not find container \"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96\": container with ID starting with 9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96 not found: ID does not exist" Apr 22 19:27:25.038803 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.038780 2576 scope.go:117] "RemoveContainer" containerID="5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5" Apr 22 19:27:25.039122 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.039074 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5"} err="failed to get container status \"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5\": rpc error: code = NotFound desc = could not find container \"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5\": container with ID starting with 5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5 not found: ID does not exist" Apr 22 19:27:25.039206 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.039125 2576 scope.go:117] "RemoveContainer" containerID="6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8" Apr 22 19:27:25.039410 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.039390 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8"} err="failed to get container status \"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8\": rpc error: code = NotFound desc = could not find container \"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8\": container with ID starting with 6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8 not found: ID does not exist" Apr 22 19:27:25.039490 ip-10-0-138-15 
kubenswrapper[2576]: I0422 19:27:25.039411 2576 scope.go:117] "RemoveContainer" containerID="01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01" Apr 22 19:27:25.039651 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.039630 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01"} err="failed to get container status \"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01\": rpc error: code = NotFound desc = could not find container \"01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01\": container with ID starting with 01464d7b391ca8c431fe72ee9ec5d65917a2deb417d82e3bef7ede200157ce01 not found: ID does not exist" Apr 22 19:27:25.039699 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.039655 2576 scope.go:117] "RemoveContainer" containerID="f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652" Apr 22 19:27:25.039902 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.039883 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652"} err="failed to get container status \"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652\": rpc error: code = NotFound desc = could not find container \"f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652\": container with ID starting with f0b29bb3ab0af4f914e258e6aad81b2138ba6f73b707ceee265d85f8d1eb9652 not found: ID does not exist" Apr 22 19:27:25.039978 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.039903 2576 scope.go:117] "RemoveContainer" containerID="0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198" Apr 22 19:27:25.040139 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.040123 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198"} err="failed to get container status \"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198\": rpc error: code = NotFound desc = could not find container \"0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198\": container with ID starting with 0994f9aa07880112ff1e13f6735f96bcb30b94eec8193efd613a98db7bac0198 not found: ID does not exist" Apr 22 19:27:25.040197 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.040138 2576 scope.go:117] "RemoveContainer" containerID="28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9" Apr 22 19:27:25.040360 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.040336 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9"} err="failed to get container status \"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9\": rpc error: code = NotFound desc = could not find container \"28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9\": container with ID starting with 28304c499cb65fcac77740821ead2d2d19ae1a8c6c85d358306ded5fe16141c9 not found: ID does not exist" Apr 22 19:27:25.040360 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.040361 2576 scope.go:117] "RemoveContainer" containerID="9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96" Apr 22 19:27:25.040613 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.040597 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96"} err="failed to get container status \"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96\": rpc error: code = NotFound desc = could not find container \"9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96\": container with ID starting with 
9738c6e33554c8cf41c54d0128ec1274fb1df15acc844c96aa47fe5ecc045b96 not found: ID does not exist" Apr 22 19:27:25.040685 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.040614 2576 scope.go:117] "RemoveContainer" containerID="5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5" Apr 22 19:27:25.040685 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.040650 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:27:25.040851 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.040829 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5"} err="failed to get container status \"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5\": rpc error: code = NotFound desc = could not find container \"5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5\": container with ID starting with 5ee91e6ae142cbcbe7111ec3c277328b78aa1f6be6f743900d7f6d947e7c46f5 not found: ID does not exist" Apr 22 19:27:25.040899 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.040853 2576 scope.go:117] "RemoveContainer" containerID="6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8" Apr 22 19:27:25.041039 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041023 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="prometheus" Apr 22 19:27:25.041127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041048 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="prometheus" Apr 22 19:27:25.041127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041060 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="init-config-reloader" Apr 22 19:27:25.041127 ip-10-0-138-15 
kubenswrapper[2576]: I0422 19:27:25.041066 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="init-config-reloader" Apr 22 19:27:25.041127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041073 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="kube-rbac-proxy-thanos" Apr 22 19:27:25.041127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041078 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="kube-rbac-proxy-thanos" Apr 22 19:27:25.041127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041092 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="130af29b-4cd5-410b-b95b-ed57b79c76d2" containerName="registry" Apr 22 19:27:25.041127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041098 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="130af29b-4cd5-410b-b95b-ed57b79c76d2" containerName="registry" Apr 22 19:27:25.041127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041104 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="kube-rbac-proxy-web" Apr 22 19:27:25.041127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041101 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8"} err="failed to get container status \"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8\": rpc error: code = NotFound desc = could not find container \"6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8\": container with ID starting with 6880ceceaea4c1bd2d363868c67302dee558e434971451971a9ff722f9362dd8 not found: ID does not exist" Apr 22 19:27:25.041127 ip-10-0-138-15 kubenswrapper[2576]: I0422 
19:27:25.041110 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="kube-rbac-proxy-web" Apr 22 19:27:25.041443 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041133 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="kube-rbac-proxy" Apr 22 19:27:25.041443 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041142 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="kube-rbac-proxy" Apr 22 19:27:25.041443 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041153 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="config-reloader" Apr 22 19:27:25.041443 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041159 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="config-reloader" Apr 22 19:27:25.041443 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041165 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="thanos-sidecar" Apr 22 19:27:25.041443 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041170 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="thanos-sidecar" Apr 22 19:27:25.041443 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041234 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="kube-rbac-proxy" Apr 22 19:27:25.041443 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041246 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="thanos-sidecar" Apr 22 19:27:25.041443 ip-10-0-138-15 kubenswrapper[2576]: I0422 
19:27:25.041252 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="kube-rbac-proxy-thanos" Apr 22 19:27:25.041443 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041259 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="config-reloader" Apr 22 19:27:25.041443 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041265 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="130af29b-4cd5-410b-b95b-ed57b79c76d2" containerName="registry" Apr 22 19:27:25.041443 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041271 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="kube-rbac-proxy-web" Apr 22 19:27:25.041443 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.041277 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" containerName="prometheus" Apr 22 19:27:25.046730 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.046715 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.052431 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.052403 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 19:27:25.052623 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.052606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 19:27:25.053100 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.052996 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bs9991tkbktio\"" Apr 22 19:27:25.053222 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.053163 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 19:27:25.053289 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.053219 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 19:27:25.053289 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.053266 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 19:27:25.053460 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.053444 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 19:27:25.053526 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.053451 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 19:27:25.054156 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.054102 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mmwnr\"" Apr 22 19:27:25.054156 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.054120 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 19:27:25.054265 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.054174 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 19:27:25.054671 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.054650 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 19:27:25.058901 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.058755 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 19:27:25.062358 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.061965 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 19:27:25.065890 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.065870 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:27:25.208242 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208202 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.208452 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208250 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/dc35bc67-b5e8-4eda-b2dc-068614394573-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.208452 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.208452 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.208452 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.208718 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.208718 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.208718 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvhc5\" (UniqueName: \"kubernetes.io/projected/dc35bc67-b5e8-4eda-b2dc-068614394573-kube-api-access-mvhc5\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.208718 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.208718 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.208718 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208609 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.208718 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.208718 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/dc35bc67-b5e8-4eda-b2dc-068614394573-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.208718 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dc35bc67-b5e8-4eda-b2dc-068614394573-config-out\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.209083 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.209083 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208807 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-config\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.209083 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208855 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-web-config\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.209083 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.208880 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.310288 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvhc5\" (UniqueName: \"kubernetes.io/projected/dc35bc67-b5e8-4eda-b2dc-068614394573-kube-api-access-mvhc5\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.310288 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310246 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.310288 
ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.310535 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310308 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.310535 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.310535 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/dc35bc67-b5e8-4eda-b2dc-068614394573-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.310535 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dc35bc67-b5e8-4eda-b2dc-068614394573-config-out\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.310535 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.310777 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-config\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.310777 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-web-config\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.310777 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.310777 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.311007 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dc35bc67-b5e8-4eda-b2dc-068614394573-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.311007 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.311007 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310867 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.311007 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310906 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.311007 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.311007 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.311007 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.310997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/dc35bc67-b5e8-4eda-b2dc-068614394573-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.311363 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.311188 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.311774 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.311701 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.313082 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.312635 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.313724 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.313665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.314742 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.314388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.314742 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.314483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.314742 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.314682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dc35bc67-b5e8-4eda-b2dc-068614394573-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.314742 ip-10-0-138-15 kubenswrapper[2576]: 
I0422 19:27:25.314690 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dc35bc67-b5e8-4eda-b2dc-068614394573-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.315133 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.315110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dc35bc67-b5e8-4eda-b2dc-068614394573-config-out\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.315216 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.315196 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.315583 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.315562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-web-config\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.315658 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.315599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.315806 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.315788 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.315872 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.315823 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-config\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.316022 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.316005 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.316458 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.316442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/dc35bc67-b5e8-4eda-b2dc-068614394573-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.320452 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.320432 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvhc5\" (UniqueName: \"kubernetes.io/projected/dc35bc67-b5e8-4eda-b2dc-068614394573-kube-api-access-mvhc5\") pod \"prometheus-k8s-0\" (UID: \"dc35bc67-b5e8-4eda-b2dc-068614394573\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.356238 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.356192 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:25.493383 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.493289 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:27:25.495810 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:27:25.495772 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc35bc67_b5e8_4eda_b2dc_068614394573.slice/crio-e5d3820d9da6eae4a35b4f146a7d42fc8be843bf25b31a8e52449f7cadedebfd WatchSource:0}: Error finding container e5d3820d9da6eae4a35b4f146a7d42fc8be843bf25b31a8e52449f7cadedebfd: Status 404 returned error can't find the container with id e5d3820d9da6eae4a35b4f146a7d42fc8be843bf25b31a8e52449f7cadedebfd Apr 22 19:27:25.987784 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.987751 2576 generic.go:358] "Generic (PLEG): container finished" podID="dc35bc67-b5e8-4eda-b2dc-068614394573" containerID="db8cf87344a0af0d9dd1f2c9a5b44cb453d346100e1996efa3df36b93794f98c" exitCode=0 Apr 22 19:27:25.987957 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.987800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dc35bc67-b5e8-4eda-b2dc-068614394573","Type":"ContainerDied","Data":"db8cf87344a0af0d9dd1f2c9a5b44cb453d346100e1996efa3df36b93794f98c"} Apr 22 19:27:25.987957 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:25.987820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dc35bc67-b5e8-4eda-b2dc-068614394573","Type":"ContainerStarted","Data":"e5d3820d9da6eae4a35b4f146a7d42fc8be843bf25b31a8e52449f7cadedebfd"} Apr 22 19:27:26.156321 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:26.156287 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d84c90-caec-4329-a0c7-f6b6536b1c07" 
path="/var/lib/kubelet/pods/75d84c90-caec-4329-a0c7-f6b6536b1c07/volumes" Apr 22 19:27:26.994879 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:26.994840 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dc35bc67-b5e8-4eda-b2dc-068614394573","Type":"ContainerStarted","Data":"4d42d2c8a9a03d39cdb66d47b2777e693bb4a0e7555e9cc9ac74f45bdca3b0d3"} Apr 22 19:27:26.994879 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:26.994882 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dc35bc67-b5e8-4eda-b2dc-068614394573","Type":"ContainerStarted","Data":"8e6114e2c5c4aad22659a12878b1c713a385a002578084d1798eb4f9d7d1657c"} Apr 22 19:27:26.995301 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:26.994896 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dc35bc67-b5e8-4eda-b2dc-068614394573","Type":"ContainerStarted","Data":"226ac41e3a60a5268f9f10fe4d15cb931045028bd473b7c7d76909aa08978565"} Apr 22 19:27:26.995301 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:26.994908 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dc35bc67-b5e8-4eda-b2dc-068614394573","Type":"ContainerStarted","Data":"82e2305d26f2b9a9d494a94da8553a1516cde44ad654da79e24950d745e547d9"} Apr 22 19:27:26.995301 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:26.994941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dc35bc67-b5e8-4eda-b2dc-068614394573","Type":"ContainerStarted","Data":"8c93ac34b7155e0c16fe039ce85fb373d3fe08183d72fa072272134f0880b927"} Apr 22 19:27:26.995301 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:26.994954 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"dc35bc67-b5e8-4eda-b2dc-068614394573","Type":"ContainerStarted","Data":"070f280424475dddc9cb062f54339123a99da01d1da10a2edb584a31c48fcd03"} Apr 22 19:27:27.024781 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:27.024721 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.02470042 podStartE2EDuration="2.02470042s" podCreationTimestamp="2026-04-22 19:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:27:27.022409678 +0000 UTC m=+267.445074998" watchObservedRunningTime="2026-04-22 19:27:27.02470042 +0000 UTC m=+267.447365728" Apr 22 19:27:30.356545 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:30.356495 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:38.571694 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:27:38.571635 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" podUID="18fa20f4-e79f-4f01-9142-38e98b2350d6" Apr 22 19:27:39.030099 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:39.030068 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" Apr 22 19:27:42.562056 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:42.562011 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" Apr 22 19:27:42.564370 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:42.564328 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/18fa20f4-e79f-4f01-9142-38e98b2350d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9f56t\" (UID: \"18fa20f4-e79f-4f01-9142-38e98b2350d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" Apr 22 19:27:42.633608 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:42.633577 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-5ns48\"" Apr 22 19:27:42.641934 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:42.641902 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" Apr 22 19:27:42.759806 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:42.759784 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9f56t"] Apr 22 19:27:42.762541 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:27:42.762510 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18fa20f4_e79f_4f01_9142_38e98b2350d6.slice/crio-666f70987fa07d4060406550344cef6dac868d076fce267905097263400254a4 WatchSource:0}: Error finding container 666f70987fa07d4060406550344cef6dac868d076fce267905097263400254a4: Status 404 returned error can't find the container with id 666f70987fa07d4060406550344cef6dac868d076fce267905097263400254a4 Apr 22 19:27:43.042400 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:43.042363 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" event={"ID":"18fa20f4-e79f-4f01-9142-38e98b2350d6","Type":"ContainerStarted","Data":"666f70987fa07d4060406550344cef6dac868d076fce267905097263400254a4"} Apr 22 19:27:44.047082 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:44.046993 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" event={"ID":"18fa20f4-e79f-4f01-9142-38e98b2350d6","Type":"ContainerStarted","Data":"985ffc30aeda023ba32d9252df586435f59abe471e559fefab3ad262e33860ab"} Apr 22 19:27:44.064399 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:27:44.064348 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9f56t" podStartSLOduration=274.033628242 podStartE2EDuration="4m35.064333581s" podCreationTimestamp="2026-04-22 19:23:09 +0000 UTC" firstStartedPulling="2026-04-22 19:27:42.764368658 
+0000 UTC m=+283.187033943" lastFinishedPulling="2026-04-22 19:27:43.795073979 +0000 UTC m=+284.217739282" observedRunningTime="2026-04-22 19:27:44.062978983 +0000 UTC m=+284.485644303" watchObservedRunningTime="2026-04-22 19:27:44.064333581 +0000 UTC m=+284.486998888" Apr 22 19:28:00.052429 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:28:00.052391 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/1.log" Apr 22 19:28:00.052901 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:28:00.052391 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/1.log" Apr 22 19:28:00.060416 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:28:00.060394 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log" Apr 22 19:28:00.060536 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:28:00.060419 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log" Apr 22 19:28:00.063788 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:28:00.063767 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 19:28:25.357375 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:28:25.357338 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:25.372460 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:28:25.372425 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:26.188095 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:28:26.188066 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:33:00.077434 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:00.077403 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/1.log" Apr 22 19:33:00.078614 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:00.078594 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/1.log" Apr 22 19:33:00.084868 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:00.084847 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log" Apr 22 19:33:00.085226 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:00.085209 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log" Apr 22 19:33:29.209869 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:29.209836 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-9rgth"] Apr 22 19:33:29.213019 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:29.213003 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-9rgth" Apr 22 19:33:29.215979 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:29.215947 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 19:33:29.215979 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:29.215947 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4455v\"" Apr 22 19:33:29.216157 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:29.215990 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 19:33:29.217173 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:29.217150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 19:33:29.228959 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:29.228912 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-9rgth"] Apr 22 19:33:29.350202 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:29.350163 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjjj5\" (UniqueName: \"kubernetes.io/projected/b82d74ee-a068-4108-bba3-2a0af669f857-kube-api-access-fjjj5\") pod \"s3-init-9rgth\" (UID: \"b82d74ee-a068-4108-bba3-2a0af669f857\") " pod="kserve/s3-init-9rgth" Apr 22 19:33:29.451212 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:29.451173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjjj5\" (UniqueName: \"kubernetes.io/projected/b82d74ee-a068-4108-bba3-2a0af669f857-kube-api-access-fjjj5\") pod \"s3-init-9rgth\" (UID: \"b82d74ee-a068-4108-bba3-2a0af669f857\") " pod="kserve/s3-init-9rgth" Apr 22 19:33:29.460411 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:29.460343 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjjj5\" (UniqueName: 
\"kubernetes.io/projected/b82d74ee-a068-4108-bba3-2a0af669f857-kube-api-access-fjjj5\") pod \"s3-init-9rgth\" (UID: \"b82d74ee-a068-4108-bba3-2a0af669f857\") " pod="kserve/s3-init-9rgth" Apr 22 19:33:29.535916 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:29.535873 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-9rgth" Apr 22 19:33:29.654487 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:29.654451 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-9rgth"] Apr 22 19:33:29.657574 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:33:29.657541 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb82d74ee_a068_4108_bba3_2a0af669f857.slice/crio-7e3b38b2c6b3fb85a70aba3bb5c2ac31911496f2a45d22e1c3098b10f2dc22bc WatchSource:0}: Error finding container 7e3b38b2c6b3fb85a70aba3bb5c2ac31911496f2a45d22e1c3098b10f2dc22bc: Status 404 returned error can't find the container with id 7e3b38b2c6b3fb85a70aba3bb5c2ac31911496f2a45d22e1c3098b10f2dc22bc Apr 22 19:33:29.659242 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:29.659223 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:33:30.048978 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:30.048937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9rgth" event={"ID":"b82d74ee-a068-4108-bba3-2a0af669f857","Type":"ContainerStarted","Data":"7e3b38b2c6b3fb85a70aba3bb5c2ac31911496f2a45d22e1c3098b10f2dc22bc"} Apr 22 19:33:35.067771 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:35.067732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9rgth" event={"ID":"b82d74ee-a068-4108-bba3-2a0af669f857","Type":"ContainerStarted","Data":"900acedef3d76aedac336056ea8ed3a55ed37b73b496eb736f5464af43c29e30"} Apr 22 19:33:35.085631 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:35.085570 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-9rgth" podStartSLOduration=1.707401438 podStartE2EDuration="6.085551487s" podCreationTimestamp="2026-04-22 19:33:29 +0000 UTC" firstStartedPulling="2026-04-22 19:33:29.659375619 +0000 UTC m=+630.082040908" lastFinishedPulling="2026-04-22 19:33:34.037525672 +0000 UTC m=+634.460190957" observedRunningTime="2026-04-22 19:33:35.083948938 +0000 UTC m=+635.506614245" watchObservedRunningTime="2026-04-22 19:33:35.085551487 +0000 UTC m=+635.508216797" Apr 22 19:33:38.078187 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:38.078152 2576 generic.go:358] "Generic (PLEG): container finished" podID="b82d74ee-a068-4108-bba3-2a0af669f857" containerID="900acedef3d76aedac336056ea8ed3a55ed37b73b496eb736f5464af43c29e30" exitCode=0 Apr 22 19:33:38.078649 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:38.078235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9rgth" event={"ID":"b82d74ee-a068-4108-bba3-2a0af669f857","Type":"ContainerDied","Data":"900acedef3d76aedac336056ea8ed3a55ed37b73b496eb736f5464af43c29e30"} Apr 22 19:33:39.208507 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:39.208484 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-9rgth" Apr 22 19:33:39.343085 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:39.342990 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjjj5\" (UniqueName: \"kubernetes.io/projected/b82d74ee-a068-4108-bba3-2a0af669f857-kube-api-access-fjjj5\") pod \"b82d74ee-a068-4108-bba3-2a0af669f857\" (UID: \"b82d74ee-a068-4108-bba3-2a0af669f857\") " Apr 22 19:33:39.345196 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:39.345169 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b82d74ee-a068-4108-bba3-2a0af669f857-kube-api-access-fjjj5" (OuterVolumeSpecName: "kube-api-access-fjjj5") pod "b82d74ee-a068-4108-bba3-2a0af669f857" (UID: "b82d74ee-a068-4108-bba3-2a0af669f857"). InnerVolumeSpecName "kube-api-access-fjjj5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:33:39.443729 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:39.443682 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fjjj5\" (UniqueName: \"kubernetes.io/projected/b82d74ee-a068-4108-bba3-2a0af669f857-kube-api-access-fjjj5\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 22 19:33:40.085765 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:40.085730 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-9rgth" Apr 22 19:33:40.085953 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:40.085733 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9rgth" event={"ID":"b82d74ee-a068-4108-bba3-2a0af669f857","Type":"ContainerDied","Data":"7e3b38b2c6b3fb85a70aba3bb5c2ac31911496f2a45d22e1c3098b10f2dc22bc"} Apr 22 19:33:40.085953 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:33:40.085844 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e3b38b2c6b3fb85a70aba3bb5c2ac31911496f2a45d22e1c3098b10f2dc22bc" Apr 22 19:37:04.261491 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.261454 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k"] Apr 22 19:37:04.261942 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.261767 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b82d74ee-a068-4108-bba3-2a0af669f857" containerName="s3-init" Apr 22 19:37:04.261942 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.261777 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82d74ee-a068-4108-bba3-2a0af669f857" containerName="s3-init" Apr 22 19:37:04.261942 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.261842 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b82d74ee-a068-4108-bba3-2a0af669f857" containerName="s3-init" Apr 22 19:37:04.264645 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.264628 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" Apr 22 19:37:04.267238 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.267213 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-8e08c-kube-rbac-proxy-sar-config\"" Apr 22 19:37:04.267366 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.267305 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 19:37:04.267366 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.267340 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-8e08c-serving-cert\"" Apr 22 19:37:04.267482 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.267411 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qbzcz\"" Apr 22 19:37:04.273518 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.273493 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k"] Apr 22 19:37:04.323209 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.323168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c866a62f-d697-49e0-8a01-2b9eb787aa57-openshift-service-ca-bundle\") pod \"model-chainer-raw-8e08c-5f649888f5-k2t7k\" (UID: \"c866a62f-d697-49e0-8a01-2b9eb787aa57\") " pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" Apr 22 19:37:04.323390 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.323233 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c866a62f-d697-49e0-8a01-2b9eb787aa57-proxy-tls\") pod \"model-chainer-raw-8e08c-5f649888f5-k2t7k\" 
(UID: \"c866a62f-d697-49e0-8a01-2b9eb787aa57\") " pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" Apr 22 19:37:04.423993 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.423956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c866a62f-d697-49e0-8a01-2b9eb787aa57-openshift-service-ca-bundle\") pod \"model-chainer-raw-8e08c-5f649888f5-k2t7k\" (UID: \"c866a62f-d697-49e0-8a01-2b9eb787aa57\") " pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" Apr 22 19:37:04.424159 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.424006 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c866a62f-d697-49e0-8a01-2b9eb787aa57-proxy-tls\") pod \"model-chainer-raw-8e08c-5f649888f5-k2t7k\" (UID: \"c866a62f-d697-49e0-8a01-2b9eb787aa57\") " pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" Apr 22 19:37:04.424159 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:37:04.424116 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-8e08c-serving-cert: secret "model-chainer-raw-8e08c-serving-cert" not found Apr 22 19:37:04.424230 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:37:04.424169 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c866a62f-d697-49e0-8a01-2b9eb787aa57-proxy-tls podName:c866a62f-d697-49e0-8a01-2b9eb787aa57 nodeName:}" failed. No retries permitted until 2026-04-22 19:37:04.92415406 +0000 UTC m=+845.346819345 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c866a62f-d697-49e0-8a01-2b9eb787aa57-proxy-tls") pod "model-chainer-raw-8e08c-5f649888f5-k2t7k" (UID: "c866a62f-d697-49e0-8a01-2b9eb787aa57") : secret "model-chainer-raw-8e08c-serving-cert" not found Apr 22 19:37:04.424610 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.424590 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c866a62f-d697-49e0-8a01-2b9eb787aa57-openshift-service-ca-bundle\") pod \"model-chainer-raw-8e08c-5f649888f5-k2t7k\" (UID: \"c866a62f-d697-49e0-8a01-2b9eb787aa57\") " pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" Apr 22 19:37:04.928065 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.928026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c866a62f-d697-49e0-8a01-2b9eb787aa57-proxy-tls\") pod \"model-chainer-raw-8e08c-5f649888f5-k2t7k\" (UID: \"c866a62f-d697-49e0-8a01-2b9eb787aa57\") " pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" Apr 22 19:37:04.930564 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:04.930540 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c866a62f-d697-49e0-8a01-2b9eb787aa57-proxy-tls\") pod \"model-chainer-raw-8e08c-5f649888f5-k2t7k\" (UID: \"c866a62f-d697-49e0-8a01-2b9eb787aa57\") " pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" Apr 22 19:37:05.176204 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:05.176166 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" Apr 22 19:37:05.296310 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:05.296257 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k"] Apr 22 19:37:05.299817 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:37:05.299791 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc866a62f_d697_49e0_8a01_2b9eb787aa57.slice/crio-5d61a52e21883ce3eb1991c3a927730d2b4bc7688139a9753d0b7d47b30e704a WatchSource:0}: Error finding container 5d61a52e21883ce3eb1991c3a927730d2b4bc7688139a9753d0b7d47b30e704a: Status 404 returned error can't find the container with id 5d61a52e21883ce3eb1991c3a927730d2b4bc7688139a9753d0b7d47b30e704a Apr 22 19:37:05.689086 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:05.689050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" event={"ID":"c866a62f-d697-49e0-8a01-2b9eb787aa57","Type":"ContainerStarted","Data":"5d61a52e21883ce3eb1991c3a927730d2b4bc7688139a9753d0b7d47b30e704a"} Apr 22 19:37:08.700142 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:08.700102 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" event={"ID":"c866a62f-d697-49e0-8a01-2b9eb787aa57","Type":"ContainerStarted","Data":"8a4afefff835d12dc364fbd49d3df7f98abc7a1af98e08be72dcd2782fa03409"} Apr 22 19:37:08.700670 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:08.700223 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" Apr 22 19:37:08.716135 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:08.716085 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" podStartSLOduration=2.309546998 podStartE2EDuration="4.716053616s" podCreationTimestamp="2026-04-22 19:37:04 +0000 UTC" firstStartedPulling="2026-04-22 19:37:05.301608123 +0000 UTC m=+845.724273411" lastFinishedPulling="2026-04-22 19:37:07.70811474 +0000 UTC m=+848.130780029" observedRunningTime="2026-04-22 19:37:08.715612768 +0000 UTC m=+849.138278076" watchObservedRunningTime="2026-04-22 19:37:08.716053616 +0000 UTC m=+849.138718923" Apr 22 19:37:14.335469 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:14.335438 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k"] Apr 22 19:37:14.335859 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:14.335669 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" podUID="c866a62f-d697-49e0-8a01-2b9eb787aa57" containerName="model-chainer-raw-8e08c" containerID="cri-o://8a4afefff835d12dc364fbd49d3df7f98abc7a1af98e08be72dcd2782fa03409" gracePeriod=30 Apr 22 19:37:14.340649 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:14.340618 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" podUID="c866a62f-d697-49e0-8a01-2b9eb787aa57" containerName="model-chainer-raw-8e08c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:37:19.339889 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:19.339847 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" podUID="c866a62f-d697-49e0-8a01-2b9eb787aa57" containerName="model-chainer-raw-8e08c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:37:24.339874 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:24.339833 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" podUID="c866a62f-d697-49e0-8a01-2b9eb787aa57" containerName="model-chainer-raw-8e08c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:37:29.339937 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:29.339884 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" podUID="c866a62f-d697-49e0-8a01-2b9eb787aa57" containerName="model-chainer-raw-8e08c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:37:34.339908 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:34.339867 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" podUID="c866a62f-d697-49e0-8a01-2b9eb787aa57" containerName="model-chainer-raw-8e08c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:37:39.340354 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:39.340317 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" podUID="c866a62f-d697-49e0-8a01-2b9eb787aa57" containerName="model-chainer-raw-8e08c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:37:44.340616 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:44.340570 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" podUID="c866a62f-d697-49e0-8a01-2b9eb787aa57" containerName="model-chainer-raw-8e08c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:37:44.814458 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:44.814423 2576 generic.go:358] "Generic (PLEG): container finished" podID="c866a62f-d697-49e0-8a01-2b9eb787aa57" containerID="8a4afefff835d12dc364fbd49d3df7f98abc7a1af98e08be72dcd2782fa03409" exitCode=0 Apr 22 19:37:44.814638 ip-10-0-138-15 kubenswrapper[2576]: 
I0422 19:37:44.814501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" event={"ID":"c866a62f-d697-49e0-8a01-2b9eb787aa57","Type":"ContainerDied","Data":"8a4afefff835d12dc364fbd49d3df7f98abc7a1af98e08be72dcd2782fa03409"} Apr 22 19:37:44.971151 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:44.971125 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" Apr 22 19:37:45.062818 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:45.062788 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c866a62f-d697-49e0-8a01-2b9eb787aa57-openshift-service-ca-bundle\") pod \"c866a62f-d697-49e0-8a01-2b9eb787aa57\" (UID: \"c866a62f-d697-49e0-8a01-2b9eb787aa57\") " Apr 22 19:37:45.062818 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:45.062830 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c866a62f-d697-49e0-8a01-2b9eb787aa57-proxy-tls\") pod \"c866a62f-d697-49e0-8a01-2b9eb787aa57\" (UID: \"c866a62f-d697-49e0-8a01-2b9eb787aa57\") " Apr 22 19:37:45.063267 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:45.063241 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c866a62f-d697-49e0-8a01-2b9eb787aa57-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c866a62f-d697-49e0-8a01-2b9eb787aa57" (UID: "c866a62f-d697-49e0-8a01-2b9eb787aa57"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:37:45.065084 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:45.065043 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c866a62f-d697-49e0-8a01-2b9eb787aa57-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c866a62f-d697-49e0-8a01-2b9eb787aa57" (UID: "c866a62f-d697-49e0-8a01-2b9eb787aa57"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:37:45.164146 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:45.164115 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c866a62f-d697-49e0-8a01-2b9eb787aa57-openshift-service-ca-bundle\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 22 19:37:45.164146 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:45.164141 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c866a62f-d697-49e0-8a01-2b9eb787aa57-proxy-tls\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 22 19:37:45.818196 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:45.818161 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" event={"ID":"c866a62f-d697-49e0-8a01-2b9eb787aa57","Type":"ContainerDied","Data":"5d61a52e21883ce3eb1991c3a927730d2b4bc7688139a9753d0b7d47b30e704a"} Apr 22 19:37:45.818196 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:45.818188 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k" Apr 22 19:37:45.818703 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:45.818214 2576 scope.go:117] "RemoveContainer" containerID="8a4afefff835d12dc364fbd49d3df7f98abc7a1af98e08be72dcd2782fa03409" Apr 22 19:37:45.839390 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:45.839363 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k"] Apr 22 19:37:45.842905 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:45.842881 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-8e08c-5f649888f5-k2t7k"] Apr 22 19:37:46.160364 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:37:46.160330 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c866a62f-d697-49e0-8a01-2b9eb787aa57" path="/var/lib/kubelet/pods/c866a62f-d697-49e0-8a01-2b9eb787aa57/volumes" Apr 22 19:38:00.100058 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:00.100030 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/1.log" Apr 22 19:38:00.102195 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:00.102155 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/1.log" Apr 22 19:38:00.106769 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:00.106748 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log" Apr 22 19:38:00.108667 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:00.108649 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log" Apr 22 
19:38:54.595268 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.595237 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6"] Apr 22 19:38:54.597472 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.595555 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c866a62f-d697-49e0-8a01-2b9eb787aa57" containerName="model-chainer-raw-8e08c" Apr 22 19:38:54.597472 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.595565 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c866a62f-d697-49e0-8a01-2b9eb787aa57" containerName="model-chainer-raw-8e08c" Apr 22 19:38:54.597472 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.595611 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c866a62f-d697-49e0-8a01-2b9eb787aa57" containerName="model-chainer-raw-8e08c" Apr 22 19:38:54.598394 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.598379 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" Apr 22 19:38:54.601245 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.601209 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-ed925-serving-cert\"" Apr 22 19:38:54.601245 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.601239 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 19:38:54.601427 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.601209 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-ed925-kube-rbac-proxy-sar-config\"" Apr 22 19:38:54.601427 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.601265 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qbzcz\"" Apr 22 19:38:54.606365 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.606346 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6"] Apr 22 19:38:54.627018 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.626981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71db560a-0382-4d72-bcad-ab0680af5d0a-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-ed925-68b69d48f5-777g6\" (UID: \"71db560a-0382-4d72-bcad-ab0680af5d0a\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" Apr 22 19:38:54.627151 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.627054 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71db560a-0382-4d72-bcad-ab0680af5d0a-proxy-tls\") pod 
\"model-chainer-raw-hpa-ed925-68b69d48f5-777g6\" (UID: \"71db560a-0382-4d72-bcad-ab0680af5d0a\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" Apr 22 19:38:54.727655 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.727626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71db560a-0382-4d72-bcad-ab0680af5d0a-proxy-tls\") pod \"model-chainer-raw-hpa-ed925-68b69d48f5-777g6\" (UID: \"71db560a-0382-4d72-bcad-ab0680af5d0a\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" Apr 22 19:38:54.727835 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.727697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71db560a-0382-4d72-bcad-ab0680af5d0a-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-ed925-68b69d48f5-777g6\" (UID: \"71db560a-0382-4d72-bcad-ab0680af5d0a\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" Apr 22 19:38:54.728303 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.728278 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71db560a-0382-4d72-bcad-ab0680af5d0a-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-ed925-68b69d48f5-777g6\" (UID: \"71db560a-0382-4d72-bcad-ab0680af5d0a\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" Apr 22 19:38:54.730065 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.730043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71db560a-0382-4d72-bcad-ab0680af5d0a-proxy-tls\") pod \"model-chainer-raw-hpa-ed925-68b69d48f5-777g6\" (UID: \"71db560a-0382-4d72-bcad-ab0680af5d0a\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" Apr 22 19:38:54.909765 
ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:54.909715 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" Apr 22 19:38:55.026680 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:55.026618 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6"] Apr 22 19:38:55.029275 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:38:55.029250 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71db560a_0382_4d72_bcad_ab0680af5d0a.slice/crio-aacb9783431cee347d5c68b0ce284cdc74bae1a3e3da11cc3729f0325e144e8c WatchSource:0}: Error finding container aacb9783431cee347d5c68b0ce284cdc74bae1a3e3da11cc3729f0325e144e8c: Status 404 returned error can't find the container with id aacb9783431cee347d5c68b0ce284cdc74bae1a3e3da11cc3729f0325e144e8c Apr 22 19:38:55.031035 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:55.031018 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:38:56.024528 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:56.024496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" event={"ID":"71db560a-0382-4d72-bcad-ab0680af5d0a","Type":"ContainerStarted","Data":"121a20915d463ad49a66600e3250a3f0489f3e81af9ab7276d5e533cc03d7968"} Apr 22 19:38:56.024528 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:56.024531 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" event={"ID":"71db560a-0382-4d72-bcad-ab0680af5d0a","Type":"ContainerStarted","Data":"aacb9783431cee347d5c68b0ce284cdc74bae1a3e3da11cc3729f0325e144e8c"} Apr 22 19:38:56.024971 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:56.024559 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" Apr 22 19:38:56.043003 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:38:56.042958 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" podStartSLOduration=2.042944961 podStartE2EDuration="2.042944961s" podCreationTimestamp="2026-04-22 19:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:38:56.041022702 +0000 UTC m=+956.463688009" watchObservedRunningTime="2026-04-22 19:38:56.042944961 +0000 UTC m=+956.465610259" Apr 22 19:39:02.034997 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:02.034966 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" Apr 22 19:39:04.647626 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:04.647591 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6"] Apr 22 19:39:04.648019 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:04.647878 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" podUID="71db560a-0382-4d72-bcad-ab0680af5d0a" containerName="model-chainer-raw-hpa-ed925" containerID="cri-o://121a20915d463ad49a66600e3250a3f0489f3e81af9ab7276d5e533cc03d7968" gracePeriod=30 Apr 22 19:39:07.032358 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:07.032318 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" podUID="71db560a-0382-4d72-bcad-ab0680af5d0a" containerName="model-chainer-raw-hpa-ed925" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:12.031937 ip-10-0-138-15 kubenswrapper[2576]: 
I0422 19:39:12.031886 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" podUID="71db560a-0382-4d72-bcad-ab0680af5d0a" containerName="model-chainer-raw-hpa-ed925" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:17.031495 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:17.031449 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" podUID="71db560a-0382-4d72-bcad-ab0680af5d0a" containerName="model-chainer-raw-hpa-ed925" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:17.031900 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:17.031574 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" Apr 22 19:39:22.032708 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:22.032614 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" podUID="71db560a-0382-4d72-bcad-ab0680af5d0a" containerName="model-chainer-raw-hpa-ed925" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:27.036729 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:27.036691 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" podUID="71db560a-0382-4d72-bcad-ab0680af5d0a" containerName="model-chainer-raw-hpa-ed925" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:32.032413 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:32.032373 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" podUID="71db560a-0382-4d72-bcad-ab0680af5d0a" containerName="model-chainer-raw-hpa-ed925" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 22 19:39:34.788670 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:34.788647 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" Apr 22 19:39:34.858881 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:34.858849 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71db560a-0382-4d72-bcad-ab0680af5d0a-proxy-tls\") pod \"71db560a-0382-4d72-bcad-ab0680af5d0a\" (UID: \"71db560a-0382-4d72-bcad-ab0680af5d0a\") " Apr 22 19:39:34.859049 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:34.858894 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71db560a-0382-4d72-bcad-ab0680af5d0a-openshift-service-ca-bundle\") pod \"71db560a-0382-4d72-bcad-ab0680af5d0a\" (UID: \"71db560a-0382-4d72-bcad-ab0680af5d0a\") " Apr 22 19:39:34.859272 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:34.859248 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71db560a-0382-4d72-bcad-ab0680af5d0a-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "71db560a-0382-4d72-bcad-ab0680af5d0a" (UID: "71db560a-0382-4d72-bcad-ab0680af5d0a"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:39:34.860910 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:34.860889 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71db560a-0382-4d72-bcad-ab0680af5d0a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "71db560a-0382-4d72-bcad-ab0680af5d0a" (UID: "71db560a-0382-4d72-bcad-ab0680af5d0a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:39:34.959695 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:34.959592 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71db560a-0382-4d72-bcad-ab0680af5d0a-proxy-tls\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 22 19:39:34.959695 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:34.959638 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71db560a-0382-4d72-bcad-ab0680af5d0a-openshift-service-ca-bundle\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\"" Apr 22 19:39:35.136085 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:35.136042 2576 generic.go:358] "Generic (PLEG): container finished" podID="71db560a-0382-4d72-bcad-ab0680af5d0a" containerID="121a20915d463ad49a66600e3250a3f0489f3e81af9ab7276d5e533cc03d7968" exitCode=0 Apr 22 19:39:35.136254 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:35.136111 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" Apr 22 19:39:35.136254 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:35.136124 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" event={"ID":"71db560a-0382-4d72-bcad-ab0680af5d0a","Type":"ContainerDied","Data":"121a20915d463ad49a66600e3250a3f0489f3e81af9ab7276d5e533cc03d7968"} Apr 22 19:39:35.136254 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:35.136166 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6" event={"ID":"71db560a-0382-4d72-bcad-ab0680af5d0a","Type":"ContainerDied","Data":"aacb9783431cee347d5c68b0ce284cdc74bae1a3e3da11cc3729f0325e144e8c"} Apr 22 19:39:35.136254 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:35.136186 2576 scope.go:117] "RemoveContainer" containerID="121a20915d463ad49a66600e3250a3f0489f3e81af9ab7276d5e533cc03d7968" Apr 22 19:39:35.144190 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:35.144169 2576 scope.go:117] "RemoveContainer" containerID="121a20915d463ad49a66600e3250a3f0489f3e81af9ab7276d5e533cc03d7968" Apr 22 19:39:35.144433 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:39:35.144409 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"121a20915d463ad49a66600e3250a3f0489f3e81af9ab7276d5e533cc03d7968\": container with ID starting with 121a20915d463ad49a66600e3250a3f0489f3e81af9ab7276d5e533cc03d7968 not found: ID does not exist" containerID="121a20915d463ad49a66600e3250a3f0489f3e81af9ab7276d5e533cc03d7968" Apr 22 19:39:35.144478 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:35.144442 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"121a20915d463ad49a66600e3250a3f0489f3e81af9ab7276d5e533cc03d7968"} err="failed to get container status 
\"121a20915d463ad49a66600e3250a3f0489f3e81af9ab7276d5e533cc03d7968\": rpc error: code = NotFound desc = could not find container \"121a20915d463ad49a66600e3250a3f0489f3e81af9ab7276d5e533cc03d7968\": container with ID starting with 121a20915d463ad49a66600e3250a3f0489f3e81af9ab7276d5e533cc03d7968 not found: ID does not exist" Apr 22 19:39:35.159348 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:35.159325 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6"] Apr 22 19:39:35.162448 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:35.162427 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-ed925-68b69d48f5-777g6"] Apr 22 19:39:36.155312 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:39:36.155268 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71db560a-0382-4d72-bcad-ab0680af5d0a" path="/var/lib/kubelet/pods/71db560a-0382-4d72-bcad-ab0680af5d0a/volumes" Apr 22 19:43:00.121280 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:43:00.121250 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/1.log" Apr 22 19:43:00.124060 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:43:00.124038 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/1.log" Apr 22 19:43:00.127941 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:43:00.127904 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log" Apr 22 19:43:00.130294 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:43:00.130278 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log" Apr 22 19:47:48.745386 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.745348 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5mv67/must-gather-9g47d"] Apr 22 19:47:48.745791 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.745647 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71db560a-0382-4d72-bcad-ab0680af5d0a" containerName="model-chainer-raw-hpa-ed925" Apr 22 19:47:48.745791 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.745659 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="71db560a-0382-4d72-bcad-ab0680af5d0a" containerName="model-chainer-raw-hpa-ed925" Apr 22 19:47:48.745791 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.745723 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="71db560a-0382-4d72-bcad-ab0680af5d0a" containerName="model-chainer-raw-hpa-ed925" Apr 22 19:47:48.748711 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.748694 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5mv67/must-gather-9g47d" Apr 22 19:47:48.753236 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.753202 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5mv67\"/\"openshift-service-ca.crt\"" Apr 22 19:47:48.753379 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.753278 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5mv67\"/\"kube-root-ca.crt\"" Apr 22 19:47:48.753379 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.753309 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5mv67\"/\"default-dockercfg-dsz5t\"" Apr 22 19:47:48.764715 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.764690 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5mv67/must-gather-9g47d"] Apr 22 19:47:48.824081 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.824046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g68b2\" (UniqueName: \"kubernetes.io/projected/fc03f249-e3ef-4dd0-b602-8ecf361389ee-kube-api-access-g68b2\") pod \"must-gather-9g47d\" (UID: \"fc03f249-e3ef-4dd0-b602-8ecf361389ee\") " pod="openshift-must-gather-5mv67/must-gather-9g47d" Apr 22 19:47:48.824254 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.824091 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fc03f249-e3ef-4dd0-b602-8ecf361389ee-must-gather-output\") pod \"must-gather-9g47d\" (UID: \"fc03f249-e3ef-4dd0-b602-8ecf361389ee\") " pod="openshift-must-gather-5mv67/must-gather-9g47d" Apr 22 19:47:48.925485 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.925451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g68b2\" (UniqueName: 
\"kubernetes.io/projected/fc03f249-e3ef-4dd0-b602-8ecf361389ee-kube-api-access-g68b2\") pod \"must-gather-9g47d\" (UID: \"fc03f249-e3ef-4dd0-b602-8ecf361389ee\") " pod="openshift-must-gather-5mv67/must-gather-9g47d" Apr 22 19:47:48.925485 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.925490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fc03f249-e3ef-4dd0-b602-8ecf361389ee-must-gather-output\") pod \"must-gather-9g47d\" (UID: \"fc03f249-e3ef-4dd0-b602-8ecf361389ee\") " pod="openshift-must-gather-5mv67/must-gather-9g47d" Apr 22 19:47:48.925872 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.925853 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fc03f249-e3ef-4dd0-b602-8ecf361389ee-must-gather-output\") pod \"must-gather-9g47d\" (UID: \"fc03f249-e3ef-4dd0-b602-8ecf361389ee\") " pod="openshift-must-gather-5mv67/must-gather-9g47d" Apr 22 19:47:48.939906 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:48.939879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g68b2\" (UniqueName: \"kubernetes.io/projected/fc03f249-e3ef-4dd0-b602-8ecf361389ee-kube-api-access-g68b2\") pod \"must-gather-9g47d\" (UID: \"fc03f249-e3ef-4dd0-b602-8ecf361389ee\") " pod="openshift-must-gather-5mv67/must-gather-9g47d" Apr 22 19:47:49.073549 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:49.073464 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5mv67/must-gather-9g47d" Apr 22 19:47:49.194001 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:49.193903 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5mv67/must-gather-9g47d"] Apr 22 19:47:49.196418 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:47:49.196390 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc03f249_e3ef_4dd0_b602_8ecf361389ee.slice/crio-4913b934da59e0bc5092938c1f5e013112276763600c7cf8f78d6dbcfa7f4d20 WatchSource:0}: Error finding container 4913b934da59e0bc5092938c1f5e013112276763600c7cf8f78d6dbcfa7f4d20: Status 404 returned error can't find the container with id 4913b934da59e0bc5092938c1f5e013112276763600c7cf8f78d6dbcfa7f4d20 Apr 22 19:47:49.198086 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:49.198070 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:47:49.567254 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:49.567214 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mv67/must-gather-9g47d" event={"ID":"fc03f249-e3ef-4dd0-b602-8ecf361389ee","Type":"ContainerStarted","Data":"4913b934da59e0bc5092938c1f5e013112276763600c7cf8f78d6dbcfa7f4d20"} Apr 22 19:47:53.584748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:53.584644 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mv67/must-gather-9g47d" event={"ID":"fc03f249-e3ef-4dd0-b602-8ecf361389ee","Type":"ContainerStarted","Data":"f4fe50000a15eb4c3a4462a82e19285bee3b246bd7945fa20ff639f3f7316381"} Apr 22 19:47:53.584748 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:53.584691 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mv67/must-gather-9g47d" 
event={"ID":"fc03f249-e3ef-4dd0-b602-8ecf361389ee","Type":"ContainerStarted","Data":"657bb404c6a2ac8c347e526d137f20f45ca3bd047a841e5b1d254faa6ace0597"} Apr 22 19:47:53.606254 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:47:53.606202 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5mv67/must-gather-9g47d" podStartSLOduration=1.53084362 podStartE2EDuration="5.606184923s" podCreationTimestamp="2026-04-22 19:47:48 +0000 UTC" firstStartedPulling="2026-04-22 19:47:49.198192051 +0000 UTC m=+1489.620857340" lastFinishedPulling="2026-04-22 19:47:53.273533358 +0000 UTC m=+1493.696198643" observedRunningTime="2026-04-22 19:47:53.603530622 +0000 UTC m=+1494.026195941" watchObservedRunningTime="2026-04-22 19:47:53.606184923 +0000 UTC m=+1494.028850233" Apr 22 19:48:00.146801 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:00.146775 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/1.log" Apr 22 19:48:00.147864 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:00.147771 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/1.log" Apr 22 19:48:00.155674 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:00.155648 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log" Apr 22 19:48:00.156259 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:00.156240 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log" Apr 22 19:48:11.645642 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:11.645544 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="fc03f249-e3ef-4dd0-b602-8ecf361389ee" containerID="657bb404c6a2ac8c347e526d137f20f45ca3bd047a841e5b1d254faa6ace0597" exitCode=0 Apr 22 19:48:11.645642 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:11.645628 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mv67/must-gather-9g47d" event={"ID":"fc03f249-e3ef-4dd0-b602-8ecf361389ee","Type":"ContainerDied","Data":"657bb404c6a2ac8c347e526d137f20f45ca3bd047a841e5b1d254faa6ace0597"} Apr 22 19:48:11.646145 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:11.646025 2576 scope.go:117] "RemoveContainer" containerID="657bb404c6a2ac8c347e526d137f20f45ca3bd047a841e5b1d254faa6ace0597" Apr 22 19:48:12.389488 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:12.389454 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5mv67_must-gather-9g47d_fc03f249-e3ef-4dd0-b602-8ecf361389ee/gather/0.log" Apr 22 19:48:15.861721 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:15.861693 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-mbwdb_60fd0bdf-71f1-4c96-a444-0de0f50d1c60/global-pull-secret-syncer/0.log" Apr 22 19:48:15.916123 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:15.916093 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-cznmn_f4edf430-4780-4b0d-b495-50534d4ddccc/konnectivity-agent/0.log" Apr 22 19:48:16.088172 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:16.088143 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-15.ec2.internal_e477e0c5a4e100c99b94e03d47d9bc3f/haproxy/0.log" Apr 22 19:48:17.789256 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:17.789221 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5mv67/must-gather-9g47d"] Apr 22 19:48:17.789655 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:17.789453 2576 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="openshift-must-gather-5mv67/must-gather-9g47d" podUID="fc03f249-e3ef-4dd0-b602-8ecf361389ee" containerName="copy" containerID="cri-o://f4fe50000a15eb4c3a4462a82e19285bee3b246bd7945fa20ff639f3f7316381" gracePeriod=2 Apr 22 19:48:17.795389 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:17.795352 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5mv67/must-gather-9g47d"] Apr 22 19:48:18.018111 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.018087 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5mv67_must-gather-9g47d_fc03f249-e3ef-4dd0-b602-8ecf361389ee/copy/0.log" Apr 22 19:48:18.018480 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.018463 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5mv67/must-gather-9g47d" Apr 22 19:48:18.020954 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.020908 2576 status_manager.go:895] "Failed to get status for pod" podUID="fc03f249-e3ef-4dd0-b602-8ecf361389ee" pod="openshift-must-gather-5mv67/must-gather-9g47d" err="pods \"must-gather-9g47d\" is forbidden: User \"system:node:ip-10-0-138-15.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5mv67\": no relationship found between node 'ip-10-0-138-15.ec2.internal' and this object" Apr 22 19:48:18.180621 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.180593 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fc03f249-e3ef-4dd0-b602-8ecf361389ee-must-gather-output\") pod \"fc03f249-e3ef-4dd0-b602-8ecf361389ee\" (UID: \"fc03f249-e3ef-4dd0-b602-8ecf361389ee\") " Apr 22 19:48:18.180785 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.180695 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g68b2\" (UniqueName: 
\"kubernetes.io/projected/fc03f249-e3ef-4dd0-b602-8ecf361389ee-kube-api-access-g68b2\") pod \"fc03f249-e3ef-4dd0-b602-8ecf361389ee\" (UID: \"fc03f249-e3ef-4dd0-b602-8ecf361389ee\") " Apr 22 19:48:18.181938 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.181898 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc03f249-e3ef-4dd0-b602-8ecf361389ee-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "fc03f249-e3ef-4dd0-b602-8ecf361389ee" (UID: "fc03f249-e3ef-4dd0-b602-8ecf361389ee"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:18.182830 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.182807 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc03f249-e3ef-4dd0-b602-8ecf361389ee-kube-api-access-g68b2" (OuterVolumeSpecName: "kube-api-access-g68b2") pod "fc03f249-e3ef-4dd0-b602-8ecf361389ee" (UID: "fc03f249-e3ef-4dd0-b602-8ecf361389ee"). InnerVolumeSpecName "kube-api-access-g68b2". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:48:18.281328 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.281290 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g68b2\" (UniqueName: \"kubernetes.io/projected/fc03f249-e3ef-4dd0-b602-8ecf361389ee-kube-api-access-g68b2\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:48:18.281328 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.281319 2576 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fc03f249-e3ef-4dd0-b602-8ecf361389ee-must-gather-output\") on node \"ip-10-0-138-15.ec2.internal\" DevicePath \"\""
Apr 22 19:48:18.667503 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.667473 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5mv67_must-gather-9g47d_fc03f249-e3ef-4dd0-b602-8ecf361389ee/copy/0.log"
Apr 22 19:48:18.667834 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.667803 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc03f249-e3ef-4dd0-b602-8ecf361389ee" containerID="f4fe50000a15eb4c3a4462a82e19285bee3b246bd7945fa20ff639f3f7316381" exitCode=143
Apr 22 19:48:18.667964 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.667855 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5mv67/must-gather-9g47d"
Apr 22 19:48:18.667964 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.667912 2576 scope.go:117] "RemoveContainer" containerID="f4fe50000a15eb4c3a4462a82e19285bee3b246bd7945fa20ff639f3f7316381"
Apr 22 19:48:18.675419 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.675401 2576 scope.go:117] "RemoveContainer" containerID="657bb404c6a2ac8c347e526d137f20f45ca3bd047a841e5b1d254faa6ace0597"
Apr 22 19:48:18.687326 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.687303 2576 scope.go:117] "RemoveContainer" containerID="f4fe50000a15eb4c3a4462a82e19285bee3b246bd7945fa20ff639f3f7316381"
Apr 22 19:48:18.687577 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:48:18.687559 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4fe50000a15eb4c3a4462a82e19285bee3b246bd7945fa20ff639f3f7316381\": container with ID starting with f4fe50000a15eb4c3a4462a82e19285bee3b246bd7945fa20ff639f3f7316381 not found: ID does not exist" containerID="f4fe50000a15eb4c3a4462a82e19285bee3b246bd7945fa20ff639f3f7316381"
Apr 22 19:48:18.687624 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.687585 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4fe50000a15eb4c3a4462a82e19285bee3b246bd7945fa20ff639f3f7316381"} err="failed to get container status \"f4fe50000a15eb4c3a4462a82e19285bee3b246bd7945fa20ff639f3f7316381\": rpc error: code = NotFound desc = could not find container \"f4fe50000a15eb4c3a4462a82e19285bee3b246bd7945fa20ff639f3f7316381\": container with ID starting with f4fe50000a15eb4c3a4462a82e19285bee3b246bd7945fa20ff639f3f7316381 not found: ID does not exist"
Apr 22 19:48:18.687624 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.687604 2576 scope.go:117] "RemoveContainer" containerID="657bb404c6a2ac8c347e526d137f20f45ca3bd047a841e5b1d254faa6ace0597"
Apr 22 19:48:18.687856 ip-10-0-138-15 kubenswrapper[2576]: E0422 19:48:18.687837 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"657bb404c6a2ac8c347e526d137f20f45ca3bd047a841e5b1d254faa6ace0597\": container with ID starting with 657bb404c6a2ac8c347e526d137f20f45ca3bd047a841e5b1d254faa6ace0597 not found: ID does not exist" containerID="657bb404c6a2ac8c347e526d137f20f45ca3bd047a841e5b1d254faa6ace0597"
Apr 22 19:48:18.687907 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:18.687862 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"657bb404c6a2ac8c347e526d137f20f45ca3bd047a841e5b1d254faa6ace0597"} err="failed to get container status \"657bb404c6a2ac8c347e526d137f20f45ca3bd047a841e5b1d254faa6ace0597\": rpc error: code = NotFound desc = could not find container \"657bb404c6a2ac8c347e526d137f20f45ca3bd047a841e5b1d254faa6ace0597\": container with ID starting with 657bb404c6a2ac8c347e526d137f20f45ca3bd047a841e5b1d254faa6ace0597 not found: ID does not exist"
Apr 22 19:48:19.967015 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:19.966977 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6477979999-8mjqj_f7fe6d59-8c0a-41d8-b794-8912d6ef43e9/metrics-server/0.log"
Apr 22 19:48:20.033586 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.033521 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8dn6z_c24fec81-fd18-43a5-884d-38c5bb7a71ab/node-exporter/0.log"
Apr 22 19:48:20.061442 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.061413 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8dn6z_c24fec81-fd18-43a5-884d-38c5bb7a71ab/kube-rbac-proxy/0.log"
Apr 22 19:48:20.093201 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.093174 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8dn6z_c24fec81-fd18-43a5-884d-38c5bb7a71ab/init-textfile/0.log"
Apr 22 19:48:20.155586 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.155562 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc03f249-e3ef-4dd0-b602-8ecf361389ee" path="/var/lib/kubelet/pods/fc03f249-e3ef-4dd0-b602-8ecf361389ee/volumes"
Apr 22 19:48:20.394648 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.394618 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dc35bc67-b5e8-4eda-b2dc-068614394573/prometheus/0.log"
Apr 22 19:48:20.417457 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.417432 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dc35bc67-b5e8-4eda-b2dc-068614394573/config-reloader/0.log"
Apr 22 19:48:20.443538 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.443512 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dc35bc67-b5e8-4eda-b2dc-068614394573/thanos-sidecar/0.log"
Apr 22 19:48:20.469018 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.468996 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dc35bc67-b5e8-4eda-b2dc-068614394573/kube-rbac-proxy-web/0.log"
Apr 22 19:48:20.501810 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.501783 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dc35bc67-b5e8-4eda-b2dc-068614394573/kube-rbac-proxy/0.log"
Apr 22 19:48:20.534902 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.534851 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dc35bc67-b5e8-4eda-b2dc-068614394573/kube-rbac-proxy-thanos/0.log"
Apr 22 19:48:20.570168 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.570140 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dc35bc67-b5e8-4eda-b2dc-068614394573/init-config-reloader/0.log"
Apr 22 19:48:20.677996 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.677907 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-j24jl_5ecb5e1b-abd7-4365-a2b6-55632e29bd79/prometheus-operator-admission-webhook/0.log"
Apr 22 19:48:20.884906 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.884879 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c999d4f58-km8q5_73cfb30c-6eb9-4213-879f-37b04ae3abe9/thanos-query/0.log"
Apr 22 19:48:20.913365 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.913343 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c999d4f58-km8q5_73cfb30c-6eb9-4213-879f-37b04ae3abe9/kube-rbac-proxy-web/0.log"
Apr 22 19:48:20.950006 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.949942 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c999d4f58-km8q5_73cfb30c-6eb9-4213-879f-37b04ae3abe9/kube-rbac-proxy/0.log"
Apr 22 19:48:20.974941 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:20.974906 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c999d4f58-km8q5_73cfb30c-6eb9-4213-879f-37b04ae3abe9/prom-label-proxy/0.log"
Apr 22 19:48:21.005629 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:21.005607 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c999d4f58-km8q5_73cfb30c-6eb9-4213-879f-37b04ae3abe9/kube-rbac-proxy-rules/0.log"
Apr 22 19:48:21.037644 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:21.037621 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c999d4f58-km8q5_73cfb30c-6eb9-4213-879f-37b04ae3abe9/kube-rbac-proxy-metrics/0.log"
Apr 22 19:48:22.060869 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.060837 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-9f56t_18fa20f4-e79f-4f01-9142-38e98b2350d6/networking-console-plugin/0.log"
Apr 22 19:48:22.475841 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.475809 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/1.log"
Apr 22 19:48:22.480362 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.480342 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8hr98_23701c89-5e40-43ee-bab7-fe2709643e97/console-operator/2.log"
Apr 22 19:48:22.920220 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.920187 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"]
Apr 22 19:48:22.920500 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.920487 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc03f249-e3ef-4dd0-b602-8ecf361389ee" containerName="gather"
Apr 22 19:48:22.920545 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.920502 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc03f249-e3ef-4dd0-b602-8ecf361389ee" containerName="gather"
Apr 22 19:48:22.920545 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.920523 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc03f249-e3ef-4dd0-b602-8ecf361389ee" containerName="copy"
Apr 22 19:48:22.920545 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.920529 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc03f249-e3ef-4dd0-b602-8ecf361389ee" containerName="copy"
Apr 22 19:48:22.920638 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.920576 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc03f249-e3ef-4dd0-b602-8ecf361389ee" containerName="gather"
Apr 22 19:48:22.920638 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.920584 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc03f249-e3ef-4dd0-b602-8ecf361389ee" containerName="copy"
Apr 22 19:48:22.925610 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.925592 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:22.929625 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.929600 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-skx6f\"/\"kube-root-ca.crt\""
Apr 22 19:48:22.930971 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.930946 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-skx6f\"/\"openshift-service-ca.crt\""
Apr 22 19:48:22.931103 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.931054 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-skx6f\"/\"default-dockercfg-6vwdj\""
Apr 22 19:48:22.933130 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:22.933108 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"]
Apr 22 19:48:23.009263 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.009237 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-swq4t_fd57b9ed-3208-41fd-aab6-8c6d3078a852/download-server/0.log"
Apr 22 19:48:23.020662 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.020629 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-podres\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.020793 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.020667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-sys\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.020793 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.020708 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-proc\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.020793 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.020743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st62s\" (UniqueName: \"kubernetes.io/projected/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-kube-api-access-st62s\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.020793 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.020764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-lib-modules\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.121674 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.121643 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-podres\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.121674 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.121678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-sys\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.122127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.121698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-proc\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.122127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.121719 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-st62s\" (UniqueName: \"kubernetes.io/projected/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-kube-api-access-st62s\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.122127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.121738 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-lib-modules\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.122127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.121772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-sys\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.122127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.121793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-proc\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.122127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.121813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-podres\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.122127 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.121861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-lib-modules\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.131193 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.131176 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-st62s\" (UniqueName: \"kubernetes.io/projected/bbdc4f43-b7cc-417a-8a7e-23f5ee932021-kube-api-access-st62s\") pod \"perf-node-gather-daemonset-9n2zt\" (UID: \"bbdc4f43-b7cc-417a-8a7e-23f5ee932021\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.236870 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.236775 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.360867 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.360842 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"]
Apr 22 19:48:23.362936 ip-10-0-138-15 kubenswrapper[2576]: W0422 19:48:23.362892 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbbdc4f43_b7cc_417a_8a7e_23f5ee932021.slice/crio-5f48e008d0889ba083c562dc77b015dd77fd945d155c10d2569f4e8ab143897a WatchSource:0}: Error finding container 5f48e008d0889ba083c562dc77b015dd77fd945d155c10d2569f4e8ab143897a: Status 404 returned error can't find the container with id 5f48e008d0889ba083c562dc77b015dd77fd945d155c10d2569f4e8ab143897a
Apr 22 19:48:23.687611 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.687569 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt" event={"ID":"bbdc4f43-b7cc-417a-8a7e-23f5ee932021","Type":"ContainerStarted","Data":"6d2abf0d93987dc1461ea8cf81faca9045396fb7102de1c6b6c4cccd2f90ccc7"}
Apr 22 19:48:23.687762 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.687615 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt" event={"ID":"bbdc4f43-b7cc-417a-8a7e-23f5ee932021","Type":"ContainerStarted","Data":"5f48e008d0889ba083c562dc77b015dd77fd945d155c10d2569f4e8ab143897a"}
Apr 22 19:48:23.687762 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.687729 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:23.715912 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:23.715856 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt" podStartSLOduration=1.715842831 podStartE2EDuration="1.715842831s" podCreationTimestamp="2026-04-22 19:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:48:23.714443374 +0000 UTC m=+1524.137108681" watchObservedRunningTime="2026-04-22 19:48:23.715842831 +0000 UTC m=+1524.138508138"
Apr 22 19:48:24.259469 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:24.259435 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-n2sn2_d0674059-fa3f-4411-bea0-5b58dca69acc/dns/0.log"
Apr 22 19:48:24.287090 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:24.287062 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-n2sn2_d0674059-fa3f-4411-bea0-5b58dca69acc/kube-rbac-proxy/0.log"
Apr 22 19:48:24.434097 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:24.434065 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hftfk_4caf5855-872c-4886-aec4-eb966cfeb4c3/dns-node-resolver/0.log"
Apr 22 19:48:24.925506 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:24.925476 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-47nkl_7eb8b708-4ebf-4d5f-b8a0-ee69ff963778/node-ca/0.log"
Apr 22 19:48:26.212535 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:26.212507 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vk4gw_0094a2f2-1687-4429-a919-c7f5d7498255/serve-healthcheck-canary/0.log"
Apr 22 19:48:26.682894 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:26.682857 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fn6js_37daebbc-30c3-4548-b751-2f66c70271fa/kube-rbac-proxy/0.log"
Apr 22 19:48:26.712826 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:26.712797 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fn6js_37daebbc-30c3-4548-b751-2f66c70271fa/exporter/0.log"
Apr 22 19:48:26.744050 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:26.744024 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fn6js_37daebbc-30c3-4548-b751-2f66c70271fa/extractor/0.log"
Apr 22 19:48:29.181424 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:29.181397 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-9rgth_b82d74ee-a068-4108-bba3-2a0af669f857/s3-init/0.log"
Apr 22 19:48:29.701015 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:29.700989 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-9n2zt"
Apr 22 19:48:34.101406 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:34.101372 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-9776m_44c0d1dd-5d1c-443a-a71d-40d163d60028/kube-storage-version-migrator-operator/1.log"
Apr 22 19:48:34.102128 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:34.102110 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-9776m_44c0d1dd-5d1c-443a-a71d-40d163d60028/kube-storage-version-migrator-operator/0.log"
Apr 22 19:48:35.085107 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:35.085076 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2hbdj_280aa335-840b-490c-a36f-0cdef337ab79/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:48:35.113726 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:35.113692 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2hbdj_280aa335-840b-490c-a36f-0cdef337ab79/egress-router-binary-copy/0.log"
Apr 22 19:48:35.142998 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:35.142972 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2hbdj_280aa335-840b-490c-a36f-0cdef337ab79/cni-plugins/0.log"
Apr 22 19:48:35.173661 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:35.173640 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2hbdj_280aa335-840b-490c-a36f-0cdef337ab79/bond-cni-plugin/0.log"
Apr 22 19:48:35.199809 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:35.199785 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2hbdj_280aa335-840b-490c-a36f-0cdef337ab79/routeoverride-cni/0.log"
Apr 22 19:48:35.225939 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:35.225899 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2hbdj_280aa335-840b-490c-a36f-0cdef337ab79/whereabouts-cni-bincopy/0.log"
Apr 22 19:48:35.251688 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:35.251663 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2hbdj_280aa335-840b-490c-a36f-0cdef337ab79/whereabouts-cni/0.log"
Apr 22 19:48:35.690367 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:35.690339 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jxhs2_ee930469-602e-4383-9900-a97a25da678b/kube-multus/0.log"
Apr 22 19:48:35.857644 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:35.857616 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-m8fmk_957d9773-bf39-486e-a32e-eba60e7b49e9/network-metrics-daemon/0.log"
Apr 22 19:48:35.883002 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:35.882980 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-m8fmk_957d9773-bf39-486e-a32e-eba60e7b49e9/kube-rbac-proxy/0.log"
Apr 22 19:48:37.535780 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:37.535755 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-controller/0.log"
Apr 22 19:48:37.560276 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:37.560246 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/0.log"
Apr 22 19:48:37.567377 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:37.567356 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovn-acl-logging/1.log"
Apr 22 19:48:37.595570 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:37.595546 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/kube-rbac-proxy-node/0.log"
Apr 22 19:48:37.626107 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:37.626079 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:48:37.650334 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:37.650309 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/northd/0.log"
Apr 22 19:48:37.676662 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:37.676622 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/nbdb/0.log"
Apr 22 19:48:37.704759 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:37.704737 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/sbdb/0.log"
Apr 22 19:48:37.810039 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:37.809971 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w57k8_fd0e1c46-4f51-455c-8267-abe0b6eacfd9/ovnkube-controller/0.log"
Apr 22 19:48:38.862679 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:38.862653 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-dppdk_dd892729-a03a-4e23-814f-c6d8f9c486d8/check-endpoints/0.log"
Apr 22 19:48:38.926728 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:38.926689 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-wrvx6_7ee8fcaa-f3c9-4f0e-b1c5-e6fe0fabbfeb/network-check-target-container/0.log"
Apr 22 19:48:39.927147 ip-10-0-138-15 kubenswrapper[2576]: I0422 19:48:39.927116 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qpgmr_be8c9b4d-d8b6-438d-adfb-b1521f3c0d84/iptables-alerter/0.log"