Apr 17 20:44:08.176101 ip-10-0-137-102 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 20:44:08.537156 ip-10-0-137-102 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:44:08.537156 ip-10-0-137-102 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 20:44:08.537156 ip-10-0-137-102 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:44:08.537156 ip-10-0-137-102 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 20:44:08.537156 ip-10-0-137-102 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
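The deprecation warnings above all point at the same fix: move these flags into the KubeletConfiguration file that the kubelet loads via --config. A minimal sketch of the equivalent config fragment — the field values below are illustrative assumptions, not values read from this node:

```yaml
# Hypothetical KubeletConfiguration fragment; values are illustrative only.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction thresholds
evictionHard:
  memory.available: 100Mi
```

On this node the kubelet is started with --config=/etc/kubernetes/kubelet.conf (see the FLAG dump later in the log), so fields like these would be merged there.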
Apr 17 20:44:08.538701 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.538625 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 20:44:08.542841 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542817 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:44:08.542841 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542840 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:44:08.542841 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542844 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:44:08.542841 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542848 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:44:08.542841 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542851 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542855 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542858 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542862 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542864 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542868 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542870 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542873 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542877 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542879 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542883 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542885 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542888 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542890 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542893 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542895 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542898 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542900 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542903 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542905 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:44:08.542990 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542908 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542911 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542913 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542916 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542918 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542921 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542924 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542926 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542929 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542932 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542934 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542937 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542940 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542942 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542946 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542948 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542951 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542954 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542956 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542959 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:44:08.543503 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542961 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542967 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542970 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542973 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542977 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542981 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542984 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542987 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542990 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542993 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542996 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.542999 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543001 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543004 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543007 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543009 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543012 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543015 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543018 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:44:08.544016 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543020 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543023 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543026 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543028 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543032 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543034 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543037 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543041 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543044 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543047 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543049 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543052 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543055 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543057 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543060 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543063 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543066 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543068 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543071 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543074 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:44:08.544552 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543076 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543079 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543081 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543447 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543451 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543454 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543457 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543459 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543462 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543465 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543467 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543470 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543472 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543475 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543478 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543480 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543483 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543485 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543488 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543491 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:44:08.545068 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543494 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543497 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543499 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543502 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543505 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543507 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543510 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543514 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543517 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543520 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543523 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543525 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543528 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543531 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543533 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543535 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543538 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543541 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543543 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543546 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:44:08.545542 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543548 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543551 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543553 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543557 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543560 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543562 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543565 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543568 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543571 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543574 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543576 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543579 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543582 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543584 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543587 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543589 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543592 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543594 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543597 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543599 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:44:08.546054 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543602 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543605 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543607 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543610 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543612 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543614 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543617 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543620 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543622 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543625 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543627 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543629 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543632 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543635 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543637 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543640 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543643 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543645 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543648 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543650 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:44:08.546551 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543653 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543658 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543662 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543665 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543667 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543670 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543673 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543675 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.543678 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544816 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544831 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544839 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544844 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544849 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544852 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544857 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544861 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544865 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544868 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544872 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544875 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 20:44:08.547058 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544878 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544881 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544884 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544887 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544890 2572 flags.go:64] FLAG: --cloud-config=""
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544893 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544896 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544901 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544904 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544907 2572 flags.go:64] FLAG: --config-dir=""
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544910 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544913 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544917 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544920 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544924 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544927 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544930 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544933 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544936 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544939 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544942 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544947 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544950 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544953 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544955 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 20:44:08.547567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544959 2572 flags.go:64] FLAG: --enable-server="true"
Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544962 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544967 2572 flags.go:64] FLAG: --event-burst="100"
Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544970 2572 flags.go:64] FLAG: --event-qps="50"
Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544973 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 20:44:08.548200
ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544977 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544980 2572 flags.go:64] FLAG: --eviction-hard="" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544984 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544987 2572 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544990 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544993 2572 flags.go:64] FLAG: --eviction-soft="" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.544996 2572 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545001 2572 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545004 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545008 2572 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545010 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545013 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545017 2572 flags.go:64] FLAG: --feature-gates="" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545020 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545023 2572 flags.go:64] FLAG: 
--global-housekeeping-interval="1m0s" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545026 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545030 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545033 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545036 2572 flags.go:64] FLAG: --help="false" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545039 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-137-102.ec2.internal" Apr 17 20:44:08.548200 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545042 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545045 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545048 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545052 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545055 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545058 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545062 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545065 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 20:44:08.548856 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:44:08.545068 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545071 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545074 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545077 2572 flags.go:64] FLAG: --kube-reserved="" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545080 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545083 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545086 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545089 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545092 2572 flags.go:64] FLAG: --lock-file="" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545095 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545098 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545102 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545111 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545114 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545116 2572 flags.go:64] FLAG: 
--log-text-split-stream="false" Apr 17 20:44:08.548856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545119 2572 flags.go:64] FLAG: --logging-format="text" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545122 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545126 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545129 2572 flags.go:64] FLAG: --manifest-url="" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545132 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545136 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545140 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545144 2572 flags.go:64] FLAG: --max-pods="110" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545147 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545150 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545153 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545156 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545159 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545162 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 20:44:08.549412 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:44:08.545165 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545172 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545175 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545178 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545183 2572 flags.go:64] FLAG: --pod-cidr="" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545186 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545191 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545194 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545197 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545200 2572 flags.go:64] FLAG: --port="10250" Apr 17 20:44:08.549412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545203 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545207 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-052fd1b6cce1a7e3e" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545210 2572 flags.go:64] FLAG: --qos-reserved="" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545214 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 17 
20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545219 2572 flags.go:64] FLAG: --register-node="true" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545222 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545227 2572 flags.go:64] FLAG: --register-with-taints="" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545230 2572 flags.go:64] FLAG: --registry-burst="10" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545233 2572 flags.go:64] FLAG: --registry-qps="5" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545236 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545239 2572 flags.go:64] FLAG: --reserved-memory="" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545244 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545247 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545250 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545253 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545255 2572 flags.go:64] FLAG: --runonce="false" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545258 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545261 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545264 2572 flags.go:64] FLAG: --seccomp-default="false" 
Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545267 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545270 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545273 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545276 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545279 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545282 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545286 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 20:44:08.550020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545289 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545292 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545295 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545298 2572 flags.go:64] FLAG: --system-cgroups="" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545301 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545306 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545309 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 17 20:44:08.550725 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:44:08.545312 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545315 2572 flags.go:64] FLAG: --tls-min-version="" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545318 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545322 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545325 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545329 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545332 2572 flags.go:64] FLAG: --v="2" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545336 2572 flags.go:64] FLAG: --version="false" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545341 2572 flags.go:64] FLAG: --vmodule="" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545345 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545348 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545441 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545445 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545448 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545450 
2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545453 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545456 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:44:08.550725 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545458 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545461 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545464 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545466 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545469 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545471 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545474 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545476 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545479 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545481 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:44:08.551729 ip-10-0-137-102 
kubenswrapper[2572]: W0417 20:44:08.545484 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545487 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545490 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545492 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545495 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545497 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545500 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545503 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545507 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545509 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:44:08.551729 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545513 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545516 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545518 2572 feature_gate.go:328] unrecognized feature gate: 
SetEIPForNLBIngressController Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545521 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545523 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545526 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545528 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545531 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545534 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545536 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545539 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545542 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545544 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545547 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545549 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 
20:44:08.545552 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545555 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545558 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545560 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545563 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:44:08.552520 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545565 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545568 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545571 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545573 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545576 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545578 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545581 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545583 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 
20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545586 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545590 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545595 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545598 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545602 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545604 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545607 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545610 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545613 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545616 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545619 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:44:08.553285 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545621 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:44:08.553760 ip-10-0-137-102 
kubenswrapper[2572]: W0417 20:44:08.545624 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545627 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545629 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545632 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545634 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545637 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545640 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545642 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545645 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545647 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545650 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545652 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545655 
2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545657 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545660 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545663 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545666 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545670 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:44:08.553760 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545673 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:44:08.554249 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.545676 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:44:08.554249 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.545681 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:44:08.554249 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.554175 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 20:44:08.554249 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.554193 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 20:44:08.554249 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554240 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:44:08.554249 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554245 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:44:08.554249 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554248 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:44:08.554249 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554251 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554256 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554261 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554264 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554267 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554270 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554273 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554275 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554278 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554281 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554283 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554286 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554288 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554291 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554294 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554298 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554302 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554305 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554308 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:44:08.554457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554310 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554313 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554315 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554318 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554322 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554324 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554327 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554329 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554332 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554335 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554338 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554340 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554343 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554345 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554349 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554352 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554355 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554357 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554360 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554363 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:44:08.555045 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554365 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554368 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554370 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554373 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554376 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554378 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554381 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554384 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554386 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554389 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554392 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554394 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554397 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554399 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554402 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554404 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554407 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554409 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554412 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554415 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:44:08.555533 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554418 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554420 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554423 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554426 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554429 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554431 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554434 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554437 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554440 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554442 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554445 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554448 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554450 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554453 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554456 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554459 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554462 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554464 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554467 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:44:08.556042 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554470 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554472 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554475 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554478 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554480 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.554485 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554578 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554583 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554586 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554590 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554594 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554597 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554599 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554602 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554605 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554608 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:44:08.556504 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554610 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554613 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554616 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554618 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554621 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554624 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554626 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554629 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554632 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554634 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554637 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554639 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554642 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554645 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554647 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554649 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554652 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554654 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554657 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:44:08.556926 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554659 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554662 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554664 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554667 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554670 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554672 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554675 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554677 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554680 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554683 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554685 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554688 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554690 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554693 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554696 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554698 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554701 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554703 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554706 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554708 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:44:08.557381 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554711 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554714 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554716 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554719 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554721 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554723 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554727 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554731 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554734 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554737 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554740 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554743 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554746 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554749 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554751 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554754 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554756 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554759 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554762 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554764 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:44:08.557891 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554767 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554769 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554772 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554774 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554777 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554780 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554782 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554784 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554787 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554790 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554792 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554795 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554797 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554814 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554819 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554822 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:44:08.558406 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:08.554824 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:44:08.558950 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.554830 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:44:08.558950 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.554936 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 20:44:08.558950 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.558547 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 20:44:08.559346 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.559334 2572 server.go:1019] "Starting client certificate rotation"
Apr 17 20:44:08.559452 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.559435 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 20:44:08.559488 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.559473 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 20:44:08.581547 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.581529 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 20:44:08.583213 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.583195 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 20:44:08.601815 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.601784 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 17 20:44:08.606905 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.606888 2572 log.go:25] "Validated CRI v1 image API"
Apr 17 20:44:08.609770 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.609749 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 20:44:08.611482 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.611465 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 20:44:08.613111 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.613093 2572 fs.go:135] Filesystem UUIDs: map[14cedb09-cbca-4f11-abca-f7710f365d0c:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 d5b853fe-9744-46f5-9396-fc69ecf7e74c:/dev/nvme0n1p4]
Apr 17 20:44:08.613184 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.613112 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 20:44:08.619155 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.619041 2572 manager.go:217] Machine: {Timestamp:2026-04-17 20:44:08.617450713 +0000 UTC m=+0.342718065 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3072729 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28de0129c1447e6ce893a2afadf518 SystemUUID:ec28de01-29c1-447e-6ce8-93a2afadf518 BootID:87b534d9-811d-405f-bd39-2e0ba9ac5b00 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e8:b6:fe:45:93 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e8:b6:fe:45:93 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fe:d5:50:06:49:7b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 20:44:08.619699 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.619689 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 20:44:08.619838 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.619825 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 20:44:08.620740 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.620721 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 20:44:08.620890 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.620742 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-102.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 20:44:08.620938 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.620899 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 20:44:08.620938 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.620909 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 20:44:08.620938 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.620923 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 20:44:08.621018 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.620939 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 20:44:08.622272 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.622261 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 17 20:44:08.622371 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.622363 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 20:44:08.624932 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.624923 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 17 20:44:08.624966 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.624938 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 20:44:08.624966 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.624949 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 20:44:08.624966 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.624957 2572 kubelet.go:397] "Adding apiserver pod source" Apr 17 20:44:08.624966 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.624965 2572 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 20:44:08.625920 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.625909 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 20:44:08.625956 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.625927 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 20:44:08.628276 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.628258 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 20:44:08.630092 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.630073 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 20:44:08.632231 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.632217 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 20:44:08.632306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.632235 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 20:44:08.632306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.632242 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 20:44:08.632306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.632247 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 20:44:08.632306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.632253 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 20:44:08.632306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.632259 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 20:44:08.632306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.632265 2572 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 17 20:44:08.632306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.632270 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 20:44:08.632306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.632277 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 20:44:08.632306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.632283 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 20:44:08.632306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.632293 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 20:44:08.632306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.632302 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 20:44:08.633077 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.633062 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 20:44:08.633120 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.633081 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 20:44:08.633517 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.633499 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-422mn" Apr 17 20:44:08.634142 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:08.634122 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 20:44:08.634142 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:08.634127 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-102.ec2.internal\" is forbidden: User 
\"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 20:44:08.636243 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.636222 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-102.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 20:44:08.637067 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.637054 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 20:44:08.637114 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.637091 2572 server.go:1295] "Started kubelet" Apr 17 20:44:08.637215 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.637177 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 20:44:08.637273 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.637235 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 20:44:08.637571 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.637530 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 20:44:08.637973 ip-10-0-137-102 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 20:44:08.638326 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.638310 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 20:44:08.638958 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.638934 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-422mn"
Apr 17 20:44:08.639910 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.639897 2572 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 20:44:08.644581 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.644562 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 20:44:08.645038 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.645021 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 20:44:08.645116 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:08.645068 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 20:44:08.645741 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.645648 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 20:44:08.645741 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.645650 2572 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 20:44:08.645741 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.645672 2572 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 20:44:08.645741 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:08.645693 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 17 20:44:08.645741 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.645728 2572 factory.go:55] Registering systemd factory
Apr 17 20:44:08.645741 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.645734 2572 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 20:44:08.645741 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.645739 2572 factory.go:223] Registration of the systemd container factory successfully
Apr 17 20:44:08.645741 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.645742 2572 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 20:44:08.646175 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.645952 2572 factory.go:153] Registering CRI-O factory
Apr 17 20:44:08.646175 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.645965 2572 factory.go:223] Registration of the crio container factory successfully
Apr 17 20:44:08.646175 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.646012 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 20:44:08.646175 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.646031 2572 factory.go:103] Registering Raw factory
Apr 17 20:44:08.646175 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.646044 2572 manager.go:1196] Started watching for new ooms in manager
Apr 17 20:44:08.646411 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.646379 2572 manager.go:319] Starting recovery of all containers
Apr 17 20:44:08.647252 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.647231 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:44:08.652968 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:08.652949 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-102.ec2.internal\" not found" node="ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.659641 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.659627 2572 manager.go:324] Recovery completed
Apr 17 20:44:08.663383 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.663365 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:44:08.666078 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.666065 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:44:08.666135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.666092 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:44:08.666135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.666102 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:44:08.666761 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.666745 2572 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 20:44:08.666761 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.666758 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 20:44:08.666872 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.666773 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 20:44:08.670124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.670112 2572 policy_none.go:49] "None policy: Start"
Apr 17 20:44:08.670182 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.670128 2572 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 20:44:08.670182 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.670138 2572 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.706708 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.707844 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.707874 2572 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.707892 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.707898 2572 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:08.707959 2572 kubelet.go:2475] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.711395 2572 manager.go:341] "Starting Device Plugin manager"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:08.711430 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.711446 2572 server.go:85] "Starting device plugin registration server"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.711716 2572 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.711730 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.711824 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.711934 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.711946 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.712225 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:08.712394 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 20:44:08.721298 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:08.712441 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 17 20:44:08.808950 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.808871 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal"]
Apr 17 20:44:08.809057 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.808968 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:44:08.811021 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.811002 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:44:08.811091 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.811035 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:44:08.811091 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.811045 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:44:08.812076 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.812062 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:44:08.812769 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.812746 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:44:08.812912 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.812781 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:44:08.812912 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.812794 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:44:08.812912 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.812842 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.813269 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.813256 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:44:08.813400 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.813388 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.813443 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.813414 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:44:08.813908 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.813884 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:44:08.813908 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.813903 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:44:08.814029 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.813915 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:44:08.814029 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.813921 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:44:08.814029 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.813926 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:44:08.814029 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.813930 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:44:08.816116 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.816104 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.816173 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.816127 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:44:08.816734 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.816718 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:44:08.816794 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.816747 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:44:08.816794 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.816763 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:44:08.820446 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.820434 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.820500 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:08.820454 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-102.ec2.internal\": node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 17 20:44:08.839311 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:08.839289 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 17 20:44:08.839449 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:08.839433 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-102.ec2.internal\" not found" node="ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.843637 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:08.843621 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-102.ec2.internal\" not found" node="ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.847339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.847325 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7435db675ee99a271ee76030c18b240-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal\" (UID: \"e7435db675ee99a271ee76030c18b240\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.847398 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.847346 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7a467953d27af2c37f58628655d1416c-config\") pod \"kube-apiserver-proxy-ip-10-0-137-102.ec2.internal\" (UID: \"7a467953d27af2c37f58628655d1416c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.847398 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.847364 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e7435db675ee99a271ee76030c18b240-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal\" (UID: \"e7435db675ee99a271ee76030c18b240\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.939837 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:08.939796 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 17 20:44:08.948167 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.948144 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e7435db675ee99a271ee76030c18b240-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal\" (UID: \"e7435db675ee99a271ee76030c18b240\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.948248 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.948190 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e7435db675ee99a271ee76030c18b240-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal\" (UID: \"e7435db675ee99a271ee76030c18b240\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.948303 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.948250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7435db675ee99a271ee76030c18b240-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal\" (UID: \"e7435db675ee99a271ee76030c18b240\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.948303 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.948282 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7a467953d27af2c37f58628655d1416c-config\") pod \"kube-apiserver-proxy-ip-10-0-137-102.ec2.internal\" (UID: \"7a467953d27af2c37f58628655d1416c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.948385 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.948316 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7a467953d27af2c37f58628655d1416c-config\") pod \"kube-apiserver-proxy-ip-10-0-137-102.ec2.internal\" (UID: \"7a467953d27af2c37f58628655d1416c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:08.948385 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:08.948320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7435db675ee99a271ee76030c18b240-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal\" (UID: \"e7435db675ee99a271ee76030c18b240\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:09.040103 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:09.040074 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 17 20:44:09.140901 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:09.140827 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 17 20:44:09.142981 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.142961 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:09.146556 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.146540 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:09.241410 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:09.241379 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 17 20:44:09.341936 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:09.341911 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 17 20:44:09.442465 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:09.442384 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 17 20:44:09.543013 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:09.542976 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-102.ec2.internal\" not found"
Apr 17 20:44:09.559428 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.559400 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 20:44:09.559572 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.559559 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 20:44:09.559628 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.559605 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 20:44:09.630558 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.630400 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:44:09.640442 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.640415 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 20:39:08 +0000 UTC" deadline="2027-11-30 21:18:39.271956949 +0000 UTC"
Apr 17 20:44:09.640493 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.640441 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14208h34m29.631518536s"
Apr 17 20:44:09.645300 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.645279 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:09.645415 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.645302 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 20:44:09.654333 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.654312 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 20:44:09.656106 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.656088 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 20:44:09.656186 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.656095 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal"
Apr 17 20:44:09.661531 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.661516 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 20:44:09.674598 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:09.674574 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7435db675ee99a271ee76030c18b240.slice/crio-27f815063c6ea439e13ec6125d5b8f6733c5d5532c6aab10e1de00704a47c0b7 WatchSource:0}: Error finding container 27f815063c6ea439e13ec6125d5b8f6733c5d5532c6aab10e1de00704a47c0b7: Status 404 returned error can't find the container with id 27f815063c6ea439e13ec6125d5b8f6733c5d5532c6aab10e1de00704a47c0b7
Apr 17 20:44:09.677048 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:09.677023 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a467953d27af2c37f58628655d1416c.slice/crio-5656ea0b4df9a2b6521a6d128dd83435550bf52fc63611761438df227b41675f WatchSource:0}: Error finding container 5656ea0b4df9a2b6521a6d128dd83435550bf52fc63611761438df227b41675f: Status 404 returned error can't find the container with id 5656ea0b4df9a2b6521a6d128dd83435550bf52fc63611761438df227b41675f
Apr 17 20:44:09.677125 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.677086 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-v9xd9"
Apr 17 20:44:09.679673 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.679658 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 20:44:09.684117 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.684092 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-v9xd9"
Apr 17 20:44:09.711223 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.711126 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal" event={"ID":"e7435db675ee99a271ee76030c18b240","Type":"ContainerStarted","Data":"27f815063c6ea439e13ec6125d5b8f6733c5d5532c6aab10e1de00704a47c0b7"}
Apr 17 20:44:09.712135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:09.712117 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal" event={"ID":"7a467953d27af2c37f58628655d1416c","Type":"ContainerStarted","Data":"5656ea0b4df9a2b6521a6d128dd83435550bf52fc63611761438df227b41675f"}
Apr 17 20:44:10.019638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.019570 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:44:10.377293 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.377226 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:44:10.625420 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.625393 2572 apiserver.go:52] "Watching apiserver"
Apr 17 20:44:10.630186 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.630122 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 20:44:10.630520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.630489 2572 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7","openshift-image-registry/node-ca-vh6vg","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal","openshift-multus/multus-additional-cni-plugins-66wwv","openshift-multus/multus-k7xlk","openshift-multus/network-metrics-daemon-gggxp","kube-system/konnectivity-agent-bqh55","openshift-cluster-node-tuning-operator/tuned-ncgcb","openshift-dns/node-resolver-kk8ft","openshift-network-diagnostics/network-check-target-4595v","openshift-network-operator/iptables-alerter-542j9","openshift-ovn-kubernetes/ovnkube-node-trqtd","kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal"] Apr 17 20:44:10.632956 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.632932 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bqh55" Apr 17 20:44:10.634849 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.634826 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 20:44:10.634942 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.634851 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 20:44:10.635031 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.635015 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-spbq6\"" Apr 17 20:44:10.635146 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.635128 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vh6vg" Apr 17 20:44:10.636735 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.636717 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 20:44:10.637212 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.637014 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 20:44:10.637212 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.637054 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 20:44:10.637212 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.637080 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-s5rtq\"" Apr 17 20:44:10.637421 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.637298 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-66wwv" Apr 17 20:44:10.638974 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.638937 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 20:44:10.639092 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.639068 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-8dbbt\"" Apr 17 20:44:10.639183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.639072 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 20:44:10.639183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.639160 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 20:44:10.639308 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.639290 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 20:44:10.639399 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.639354 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 20:44:10.639534 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.639521 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.642239 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.641402 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 20:44:10.642239 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.641625 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-4bp6l\"" Apr 17 20:44:10.643442 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.643420 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:10.643524 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:10.643502 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e" Apr 17 20:44:10.647725 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.647703 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.647865 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.647815 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.649677 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.649516 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:44:10.649677 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.649519 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 20:44:10.649677 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.649554 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kbs4v\"" Apr 17 20:44:10.649677 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.649601 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 20:44:10.649677 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.649608 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 20:44:10.649677 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.649617 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 20:44:10.650071 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.649740 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7zmnk\"" Apr 17 20:44:10.650071 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.650046 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-kk8ft" Apr 17 20:44:10.651647 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.651622 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 20:44:10.651745 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.651714 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-694gh\"" Apr 17 20:44:10.651824 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.651761 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 20:44:10.652342 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.652322 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:44:10.652423 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:10.652393 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873" Apr 17 20:44:10.654728 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.654707 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-542j9" Apr 17 20:44:10.656522 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.656503 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:44:10.656615 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.656559 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 20:44:10.656777 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.656749 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 20:44:10.656935 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.656903 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xdkzp\"" Apr 17 20:44:10.657179 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657164 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.657358 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657336 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-sysconfig\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.657425 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657391 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-sysctl-d\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.657425 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657418 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-systemd\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.657521 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657443 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f090e67f-4a19-4f45-a57b-d70ee9f84598-konnectivity-ca\") pod \"konnectivity-agent-bqh55\" (UID: \"f090e67f-4a19-4f45-a57b-d70ee9f84598\") " pod="kube-system/konnectivity-agent-bqh55" Apr 17 20:44:10.657521 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657470 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-system-cni-dir\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.657521 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-cnibin\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.657663 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657535 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-kubernetes\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.657663 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657559 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/486c966c-4220-4865-a777-76f49bb4fa62-cnibin\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv" Apr 17 20:44:10.657663 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657587 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/486c966c-4220-4865-a777-76f49bb4fa62-cni-binary-copy\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv" Apr 17 20:44:10.657663 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657611 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxqpv\" (UniqueName: \"kubernetes.io/projected/83681f84-53f3-489d-9b30-0db22fc1b40e-kube-api-access-xxqpv\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") " pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:10.657663 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657633 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-registration-dir\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.657663 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657655 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-device-dir\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.657967 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-etc-selinux\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.657967 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657700 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2cxl\" (UniqueName: \"kubernetes.io/projected/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-kube-api-access-s2cxl\") pod 
\"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.657967 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657722 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s8mg\" (UniqueName: \"kubernetes.io/projected/73fda819-3b91-4892-92f0-995a9a9014c8-kube-api-access-9s8mg\") pod \"node-ca-vh6vg\" (UID: \"73fda819-3b91-4892-92f0-995a9a9014c8\") " pod="openshift-image-registry/node-ca-vh6vg" Apr 17 20:44:10.657967 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657746 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/486c966c-4220-4865-a777-76f49bb4fa62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv" Apr 17 20:44:10.657967 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657820 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7mm2\" (UniqueName: \"kubernetes.io/projected/486c966c-4220-4865-a777-76f49bb4fa62-kube-api-access-w7mm2\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv" Apr 17 20:44:10.657967 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgz7l\" (UniqueName: \"kubernetes.io/projected/441d62c8-1471-4e50-af5c-5f5a36a868f7-kube-api-access-hgz7l\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.657967 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657890 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/245c9064-c71a-4dce-bc16-4f45702063cf-tmp\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.657967 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657915 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgvs7\" (UniqueName: \"kubernetes.io/projected/245c9064-c71a-4dce-bc16-4f45702063cf-kube-api-access-kgvs7\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.657967 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657939 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-hostroot\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.657967 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657963 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.657985 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-socket-dir\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658007 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-sysctl-conf\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658029 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f090e67f-4a19-4f45-a57b-d70ee9f84598-agent-certs\") pod \"konnectivity-agent-bqh55\" (UID: \"f090e67f-4a19-4f45-a57b-d70ee9f84598\") " pod="kube-system/konnectivity-agent-bqh55" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658052 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-os-release\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658070 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-var-lib-cni-multus\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658093 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-var-lib-kubelet\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-multus-conf-dir\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658133 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/486c966c-4220-4865-a777-76f49bb4fa62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658152 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-var-lib-kubelet\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658174 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-host\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658197 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/245c9064-c71a-4dce-bc16-4f45702063cf-etc-tuned\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-multus-daemon-config\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658256 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-modprobe-d\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658283 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73fda819-3b91-4892-92f0-995a9a9014c8-host\") pod \"node-ca-vh6vg\" (UID: \"73fda819-3b91-4892-92f0-995a9a9014c8\") " pod="openshift-image-registry/node-ca-vh6vg" Apr 17 20:44:10.658365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658312 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") " pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:10.658365 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:44:10.658354 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-sys\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658392 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-multus-socket-dir-parent\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658417 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-var-lib-cni-bin\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658441 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/73fda819-3b91-4892-92f0-995a9a9014c8-serviceca\") pod \"node-ca-vh6vg\" (UID: \"73fda819-3b91-4892-92f0-995a9a9014c8\") " pod="openshift-image-registry/node-ca-vh6vg"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658465 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/486c966c-4220-4865-a777-76f49bb4fa62-system-cni-dir\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658488 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/486c966c-4220-4865-a777-76f49bb4fa62-os-release\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658512 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/486c966c-4220-4865-a777-76f49bb4fa62-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658529 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-sys-fs\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658565 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-cni-binary-copy\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658588 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-etc-kubernetes\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658609 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-run\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658637 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-lib-modules\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658660 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-multus-cni-dir\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-run-k8s-cni-cncf-io\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658733 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-run-netns\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658789 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-run-multus-certs\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.658870 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.659063 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 20:44:10.659163 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.659100 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 20:44:10.660164 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.659190 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rgfn6\""
Apr 17 20:44:10.660164 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.659344 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 20:44:10.660164 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.659412 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 20:44:10.684856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.684774 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:39:09 +0000 UTC" deadline="2027-10-17 04:32:16.001931773 +0000 UTC"
Apr 17 20:44:10.684856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.684797 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13135h48m5.317138042s"
Apr 17 20:44:10.746717 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.746696 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 20:44:10.759296 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759268 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7"
Apr 17 20:44:10.759427 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759310 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f090e67f-4a19-4f45-a57b-d70ee9f84598-agent-certs\") pod \"konnectivity-agent-bqh55\" (UID: \"f090e67f-4a19-4f45-a57b-d70ee9f84598\") " pod="kube-system/konnectivity-agent-bqh55"
Apr 17 20:44:10.759427 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759337 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-var-lib-kubelet\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.759427 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759356 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-run-ovn\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.759427 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759372 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/486c966c-4220-4865-a777-76f49bb4fa62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv"
Apr 17 20:44:10.759427 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759395 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/245c9064-c71a-4dce-bc16-4f45702063cf-etc-tuned\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.759427 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759405 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-var-lib-kubelet\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.759427 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759414 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff83db54-9c7a-4cea-8c98-f941a157f101-host-slash\") pod \"iptables-alerter-542j9\" (UID: \"ff83db54-9c7a-4cea-8c98-f941a157f101\") " pod="openshift-network-operator/iptables-alerter-542j9"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759432 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-etc-openvswitch\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759452 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-node-log\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759471 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-run-ovn-kubernetes\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759488 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02dfb3c4-9530-4d2f-a953-075c7fc184b1-ovnkube-config\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-modprobe-d\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759527 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-sys\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759545 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-var-lib-cni-bin\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759574 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-kubelet\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759590 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-cni-bin\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759596 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/486c966c-4220-4865-a777-76f49bb4fa62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759606 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/486c966c-4220-4865-a777-76f49bb4fa62-system-cni-dir\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759622 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-cni-binary-copy\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759365 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759705 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759723 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-etc-kubernetes\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759756 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ff83db54-9c7a-4cea-8c98-f941a157f101-iptables-alerter-script\") pod \"iptables-alerter-542j9\" (UID: \"ff83db54-9c7a-4cea-8c98-f941a157f101\") " pod="openshift-network-operator/iptables-alerter-542j9"
Apr 17 20:44:10.759820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759778 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4e6f44f7-3183-4cd5-9000-e9662459d6af-tmp-dir\") pod \"node-resolver-kk8ft\" (UID: \"4e6f44f7-3183-4cd5-9000-e9662459d6af\") " pod="openshift-dns/node-resolver-kk8ft"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759793 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-run-openvswitch\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759822 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02dfb3c4-9530-4d2f-a953-075c7fc184b1-ovn-node-metrics-cert\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759838 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-lib-modules\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759873 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-multus-cni-dir\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58m7g\" (UniqueName: \"kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g\") pod \"network-check-target-4595v\" (UID: \"3c8131bf-3394-4f77-956d-2b283e575873\") " pod="openshift-network-diagnostics/network-check-target-4595v"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759910 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-var-lib-cni-bin\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759938 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-var-lib-openvswitch\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759974 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-sys\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.759999 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-systemd\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760010 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-etc-kubernetes\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/486c966c-4220-4865-a777-76f49bb4fa62-system-cni-dir\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760067 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-lib-modules\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760144 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f090e67f-4a19-4f45-a57b-d70ee9f84598-konnectivity-ca\") pod \"konnectivity-agent-bqh55\" (UID: \"f090e67f-4a19-4f45-a57b-d70ee9f84598\") " pod="kube-system/konnectivity-agent-bqh55"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760154 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-multus-cni-dir\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760150 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-modprobe-d\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760176 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-system-cni-dir\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760213 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-cnibin\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.760638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760231 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-system-cni-dir\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760260 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-cnibin\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760276 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-systemd\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760319 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2cxl\" (UniqueName: \"kubernetes.io/projected/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-kube-api-access-s2cxl\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-run-systemd\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760385 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-log-socket\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760412 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/486c966c-4220-4865-a777-76f49bb4fa62-cnibin\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760438 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/486c966c-4220-4865-a777-76f49bb4fa62-cni-binary-copy\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760468 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-device-dir\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760503 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnt5m\" (UniqueName: \"kubernetes.io/projected/4e6f44f7-3183-4cd5-9000-e9662459d6af-kube-api-access-pnt5m\") pod \"node-resolver-kk8ft\" (UID: \"4e6f44f7-3183-4cd5-9000-e9662459d6af\") " pod="openshift-dns/node-resolver-kk8ft"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760530 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-run-netns\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760555 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-cni-netd\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760582 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9s8mg\" (UniqueName: \"kubernetes.io/projected/73fda819-3b91-4892-92f0-995a9a9014c8-kube-api-access-9s8mg\") pod \"node-ca-vh6vg\" (UID: \"73fda819-3b91-4892-92f0-995a9a9014c8\") " pod="openshift-image-registry/node-ca-vh6vg"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760657 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/486c966c-4220-4865-a777-76f49bb4fa62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760695 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-hostroot\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760723 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4e6f44f7-3183-4cd5-9000-e9662459d6af-hosts-file\") pod \"node-resolver-kk8ft\" (UID: \"4e6f44f7-3183-4cd5-9000-e9662459d6af\") " pod="openshift-dns/node-resolver-kk8ft"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760750 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02dfb3c4-9530-4d2f-a953-075c7fc184b1-ovnkube-script-lib\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.761520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f090e67f-4a19-4f45-a57b-d70ee9f84598-konnectivity-ca\") pod \"konnectivity-agent-bqh55\" (UID: \"f090e67f-4a19-4f45-a57b-d70ee9f84598\") " pod="kube-system/konnectivity-agent-bqh55"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760798 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-multus-daemon-config\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760856 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-cni-binary-copy\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760863 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-socket-dir\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760893 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-sysctl-conf\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-device-dir\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760922 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-os-release\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.760981 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-hostroot\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-socket-dir\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761042 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-var-lib-cni-multus\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761073 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-multus-conf-dir\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761127 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-var-lib-cni-multus\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761131 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-sysctl-conf\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761139 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/486c966c-4220-4865-a777-76f49bb4fa62-cni-binary-copy\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-var-lib-kubelet\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761180 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-os-release\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-host\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761218 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/486c966c-4220-4865-a777-76f49bb4fa62-cnibin\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv"
Apr 17 20:44:10.762124 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761230 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-multus-conf-dir\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-var-lib-kubelet\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761281 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-host\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761293 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-multus-daemon-config\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761315 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73fda819-3b91-4892-92f0-995a9a9014c8-host\") pod \"node-ca-vh6vg\" (UID: \"73fda819-3b91-4892-92f0-995a9a9014c8\") " pod="openshift-image-registry/node-ca-vh6vg" Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761352 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73fda819-3b91-4892-92f0-995a9a9014c8-host\") pod \"node-ca-vh6vg\" (UID: \"73fda819-3b91-4892-92f0-995a9a9014c8\") " pod="openshift-image-registry/node-ca-vh6vg" Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761385 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") " pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761408 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/486c966c-4220-4865-a777-76f49bb4fa62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv" Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761425 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-multus-socket-dir-parent\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 
20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:10.761468 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761468 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761477 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-multus-socket-dir-parent\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761511 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/73fda819-3b91-4892-92f0-995a9a9014c8-serviceca\") pod \"node-ca-vh6vg\" (UID: \"73fda819-3b91-4892-92f0-995a9a9014c8\") " pod="openshift-image-registry/node-ca-vh6vg" Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:10.761549 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs podName:83681f84-53f3-489d-9b30-0db22fc1b40e nodeName:}" failed. No retries permitted until 2026-04-17 20:44:11.261518719 +0000 UTC m=+2.986786087 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs") pod "network-metrics-daemon-gggxp" (UID: "83681f84-53f3-489d-9b30-0db22fc1b40e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761594 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/486c966c-4220-4865-a777-76f49bb4fa62-os-release\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv" Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761630 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/486c966c-4220-4865-a777-76f49bb4fa62-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv" Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761641 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/486c966c-4220-4865-a777-76f49bb4fa62-os-release\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv" Apr 17 20:44:10.762977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761670 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-sys-fs\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761692 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-run\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761710 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-run-k8s-cni-cncf-io\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761725 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-run-netns\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-run-multus-certs\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jptnp\" (UniqueName: \"kubernetes.io/projected/ff83db54-9c7a-4cea-8c98-f941a157f101-kube-api-access-jptnp\") pod \"iptables-alerter-542j9\" (UID: 
\"ff83db54-9c7a-4cea-8c98-f941a157f101\") " pod="openshift-network-operator/iptables-alerter-542j9" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-slash\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761797 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02dfb3c4-9530-4d2f-a953-075c7fc184b1-env-overrides\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761834 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-sysconfig\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761840 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-run\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761860 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-sysctl-d\") pod \"tuned-ncgcb\" 
(UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761876 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-systemd-units\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761896 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqvm8\" (UniqueName: \"kubernetes.io/projected/02dfb3c4-9530-4d2f-a953-075c7fc184b1-kube-api-access-bqvm8\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761898 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-run-k8s-cni-cncf-io\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761924 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-kubernetes\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761942 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxqpv\" (UniqueName: 
\"kubernetes.io/projected/83681f84-53f3-489d-9b30-0db22fc1b40e-kube-api-access-xxqpv\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") " pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761939 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-run-netns\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.763636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-registration-dir\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761988 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-etc-selinux\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.762015 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7mm2\" (UniqueName: \"kubernetes.io/projected/486c966c-4220-4865-a777-76f49bb4fa62-kube-api-access-w7mm2\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.762038 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgz7l\" (UniqueName: \"kubernetes.io/projected/441d62c8-1471-4e50-af5c-5f5a36a868f7-kube-api-access-hgz7l\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.762058 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/245c9064-c71a-4dce-bc16-4f45702063cf-tmp\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.762083 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgvs7\" (UniqueName: \"kubernetes.io/projected/245c9064-c71a-4dce-bc16-4f45702063cf-kube-api-access-kgvs7\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.762088 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-sysctl-d\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.762086 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/486c966c-4220-4865-a777-76f49bb4fa62-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv" 
Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761781 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-sys-fs\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.762193 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-kubernetes\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.762227 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/245c9064-c71a-4dce-bc16-4f45702063cf-etc-sysconfig\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.761991 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-host-run-multus-certs\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.762272 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-registration-dir\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.762316 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/441d62c8-1471-4e50-af5c-5f5a36a868f7-etc-selinux\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.763062 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/73fda819-3b91-4892-92f0-995a9a9014c8-serviceca\") pod \"node-ca-vh6vg\" (UID: \"73fda819-3b91-4892-92f0-995a9a9014c8\") " pod="openshift-image-registry/node-ca-vh6vg" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.763344 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/245c9064-c71a-4dce-bc16-4f45702063cf-etc-tuned\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.763606 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f090e67f-4a19-4f45-a57b-d70ee9f84598-agent-certs\") pod \"konnectivity-agent-bqh55\" (UID: \"f090e67f-4a19-4f45-a57b-d70ee9f84598\") " pod="kube-system/konnectivity-agent-bqh55" Apr 17 20:44:10.764339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.764228 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/245c9064-c71a-4dce-bc16-4f45702063cf-tmp\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" 
Apr 17 20:44:10.766943 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.766897 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2cxl\" (UniqueName: \"kubernetes.io/projected/691501fe-3ef5-4d58-bc4a-7ce8a3702e4d-kube-api-access-s2cxl\") pod \"multus-k7xlk\" (UID: \"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d\") " pod="openshift-multus/multus-k7xlk" Apr 17 20:44:10.767183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.767159 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s8mg\" (UniqueName: \"kubernetes.io/projected/73fda819-3b91-4892-92f0-995a9a9014c8-kube-api-access-9s8mg\") pod \"node-ca-vh6vg\" (UID: \"73fda819-3b91-4892-92f0-995a9a9014c8\") " pod="openshift-image-registry/node-ca-vh6vg" Apr 17 20:44:10.773025 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.772987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgz7l\" (UniqueName: \"kubernetes.io/projected/441d62c8-1471-4e50-af5c-5f5a36a868f7-kube-api-access-hgz7l\") pod \"aws-ebs-csi-driver-node-xgvw7\" (UID: \"441d62c8-1471-4e50-af5c-5f5a36a868f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" Apr 17 20:44:10.773321 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.773298 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgvs7\" (UniqueName: \"kubernetes.io/projected/245c9064-c71a-4dce-bc16-4f45702063cf-kube-api-access-kgvs7\") pod \"tuned-ncgcb\" (UID: \"245c9064-c71a-4dce-bc16-4f45702063cf\") " pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" Apr 17 20:44:10.773392 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.773298 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxqpv\" (UniqueName: \"kubernetes.io/projected/83681f84-53f3-489d-9b30-0db22fc1b40e-kube-api-access-xxqpv\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") 
" pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:10.773677 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.773659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7mm2\" (UniqueName: \"kubernetes.io/projected/486c966c-4220-4865-a777-76f49bb4fa62-kube-api-access-w7mm2\") pod \"multus-additional-cni-plugins-66wwv\" (UID: \"486c966c-4220-4865-a777-76f49bb4fa62\") " pod="openshift-multus/multus-additional-cni-plugins-66wwv" Apr 17 20:44:10.862427 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862394 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-run-systemd\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.862427 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862427 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-log-socket\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.862656 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862452 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnt5m\" (UniqueName: \"kubernetes.io/projected/4e6f44f7-3183-4cd5-9000-e9662459d6af-kube-api-access-pnt5m\") pod \"node-resolver-kk8ft\" (UID: \"4e6f44f7-3183-4cd5-9000-e9662459d6af\") " pod="openshift-dns/node-resolver-kk8ft" Apr 17 20:44:10.862656 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862476 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-run-netns\") pod \"ovnkube-node-trqtd\" (UID: 
\"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.862656 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862499 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-cni-netd\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.862656 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862507 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-run-systemd\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.862656 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862525 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4e6f44f7-3183-4cd5-9000-e9662459d6af-hosts-file\") pod \"node-resolver-kk8ft\" (UID: \"4e6f44f7-3183-4cd5-9000-e9662459d6af\") " pod="openshift-dns/node-resolver-kk8ft" Apr 17 20:44:10.862656 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862527 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-log-socket\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.862656 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02dfb3c4-9530-4d2f-a953-075c7fc184b1-ovnkube-script-lib\") pod \"ovnkube-node-trqtd\" (UID: 
\"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.862656 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862556 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-run-netns\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.862656 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862595 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4e6f44f7-3183-4cd5-9000-e9662459d6af-hosts-file\") pod \"node-resolver-kk8ft\" (UID: \"4e6f44f7-3183-4cd5-9000-e9662459d6af\") " pod="openshift-dns/node-resolver-kk8ft" Apr 17 20:44:10.862656 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-cni-netd\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862699 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jptnp\" (UniqueName: 
\"kubernetes.io/projected/ff83db54-9c7a-4cea-8c98-f941a157f101-kube-api-access-jptnp\") pod \"iptables-alerter-542j9\" (UID: \"ff83db54-9c7a-4cea-8c98-f941a157f101\") " pod="openshift-network-operator/iptables-alerter-542j9"
Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862762 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-slash\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02dfb3c4-9530-4d2f-a953-075c7fc184b1-env-overrides\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862821 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862833 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-slash\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862827 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-systemd-units\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862866 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-systemd-units\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862884 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqvm8\" (UniqueName: \"kubernetes.io/projected/02dfb3c4-9530-4d2f-a953-075c7fc184b1-kube-api-access-bqvm8\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-run-ovn\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862947 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff83db54-9c7a-4cea-8c98-f941a157f101-host-slash\") pod \"iptables-alerter-542j9\" (UID: \"ff83db54-9c7a-4cea-8c98-f941a157f101\") " pod="openshift-network-operator/iptables-alerter-542j9"
Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.862988 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff83db54-9c7a-4cea-8c98-f941a157f101-host-slash\") pod \"iptables-alerter-542j9\" (UID: \"ff83db54-9c7a-4cea-8c98-f941a157f101\") " pod="openshift-network-operator/iptables-alerter-542j9"
Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863006 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-run-ovn\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863090 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-etc-openvswitch\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863120 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-node-log\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-run-ovn-kubernetes\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02dfb3c4-9530-4d2f-a953-075c7fc184b1-ovnkube-config\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863186 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02dfb3c4-9530-4d2f-a953-075c7fc184b1-env-overrides\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-kubelet\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863203 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-node-log\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-cni-bin\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863233 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-run-ovn-kubernetes\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863241 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ff83db54-9c7a-4cea-8c98-f941a157f101-iptables-alerter-script\") pod \"iptables-alerter-542j9\" (UID: \"ff83db54-9c7a-4cea-8c98-f941a157f101\") " pod="openshift-network-operator/iptables-alerter-542j9"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863251 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-kubelet\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863153 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-etc-openvswitch\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863265 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4e6f44f7-3183-4cd5-9000-e9662459d6af-tmp-dir\") pod \"node-resolver-kk8ft\" (UID: \"4e6f44f7-3183-4cd5-9000-e9662459d6af\") " pod="openshift-dns/node-resolver-kk8ft"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863311 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-host-cni-bin\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863322 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-run-openvswitch\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863362 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02dfb3c4-9530-4d2f-a953-075c7fc184b1-ovn-node-metrics-cert\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863389 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58m7g\" (UniqueName: \"kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g\") pod \"network-check-target-4595v\" (UID: \"3c8131bf-3394-4f77-956d-2b283e575873\") " pod="openshift-network-diagnostics/network-check-target-4595v"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863415 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-var-lib-openvswitch\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863416 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-run-openvswitch\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.863858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863491 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4e6f44f7-3183-4cd5-9000-e9662459d6af-tmp-dir\") pod \"node-resolver-kk8ft\" (UID: \"4e6f44f7-3183-4cd5-9000-e9662459d6af\") " pod="openshift-dns/node-resolver-kk8ft"
Apr 17 20:44:10.864698 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863483 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02dfb3c4-9530-4d2f-a953-075c7fc184b1-var-lib-openvswitch\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.864698 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863674 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02dfb3c4-9530-4d2f-a953-075c7fc184b1-ovnkube-config\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.864698 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863714 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ff83db54-9c7a-4cea-8c98-f941a157f101-iptables-alerter-script\") pod \"iptables-alerter-542j9\" (UID: \"ff83db54-9c7a-4cea-8c98-f941a157f101\") " pod="openshift-network-operator/iptables-alerter-542j9"
Apr 17 20:44:10.864698 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.863919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02dfb3c4-9530-4d2f-a953-075c7fc184b1-ovnkube-script-lib\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.865530 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.865506 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02dfb3c4-9530-4d2f-a953-075c7fc184b1-ovn-node-metrics-cert\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.867749 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:10.867713 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:44:10.867749 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:10.867740 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:44:10.867749 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:10.867752 2572 projected.go:194] Error preparing data for projected volume kube-api-access-58m7g for pod openshift-network-diagnostics/network-check-target-4595v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:44:10.867976 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:10.867827 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g podName:3c8131bf-3394-4f77-956d-2b283e575873 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:11.367795773 +0000 UTC m=+3.093063107 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-58m7g" (UniqueName: "kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g") pod "network-check-target-4595v" (UID: "3c8131bf-3394-4f77-956d-2b283e575873") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:44:10.870037 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.870016 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnt5m\" (UniqueName: \"kubernetes.io/projected/4e6f44f7-3183-4cd5-9000-e9662459d6af-kube-api-access-pnt5m\") pod \"node-resolver-kk8ft\" (UID: \"4e6f44f7-3183-4cd5-9000-e9662459d6af\") " pod="openshift-dns/node-resolver-kk8ft"
Apr 17 20:44:10.870199 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.870180 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jptnp\" (UniqueName: \"kubernetes.io/projected/ff83db54-9c7a-4cea-8c98-f941a157f101-kube-api-access-jptnp\") pod \"iptables-alerter-542j9\" (UID: \"ff83db54-9c7a-4cea-8c98-f941a157f101\") " pod="openshift-network-operator/iptables-alerter-542j9"
Apr 17 20:44:10.870323 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.870305 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqvm8\" (UniqueName: \"kubernetes.io/projected/02dfb3c4-9530-4d2f-a953-075c7fc184b1-kube-api-access-bqvm8\") pod \"ovnkube-node-trqtd\" (UID: \"02dfb3c4-9530-4d2f-a953-075c7fc184b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:10.946320 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.946256 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bqh55"
Apr 17 20:44:10.953046 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.953027 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vh6vg"
Apr 17 20:44:10.961662 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.961644 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-66wwv"
Apr 17 20:44:10.967692 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.967672 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-k7xlk"
Apr 17 20:44:10.974267 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.974251 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7"
Apr 17 20:44:10.981836 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.981819 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ncgcb"
Apr 17 20:44:10.988341 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.988324 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kk8ft"
Apr 17 20:44:10.995864 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:10.995844 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-542j9"
Apr 17 20:44:11.000482 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.000462 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:11.107476 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.107447 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:44:11.265827 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.265733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") " pod="openshift-multus/network-metrics-daemon-gggxp"
Apr 17 20:44:11.265993 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:11.265882 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:44:11.265993 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:11.265954 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs podName:83681f84-53f3-489d-9b30-0db22fc1b40e nodeName:}" failed. No retries permitted until 2026-04-17 20:44:12.265933788 +0000 UTC m=+3.991201132 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs") pod "network-metrics-daemon-gggxp" (UID: "83681f84-53f3-489d-9b30-0db22fc1b40e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:44:11.321448 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:11.321421 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73fda819_3b91_4892_92f0_995a9a9014c8.slice/crio-9c69abb9d177bcb8b8580a83d4e8776c18eec497c4c5ecb6a651fedf0b411fbb WatchSource:0}: Error finding container 9c69abb9d177bcb8b8580a83d4e8776c18eec497c4c5ecb6a651fedf0b411fbb: Status 404 returned error can't find the container with id 9c69abb9d177bcb8b8580a83d4e8776c18eec497c4c5ecb6a651fedf0b411fbb
Apr 17 20:44:11.322678 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:11.322551 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod441d62c8_1471_4e50_af5c_5f5a36a868f7.slice/crio-43afb9882f32e04ed396e164f3c60b7351cc06d0949b97161a565675371e8c1c WatchSource:0}: Error finding container 43afb9882f32e04ed396e164f3c60b7351cc06d0949b97161a565675371e8c1c: Status 404 returned error can't find the container with id 43afb9882f32e04ed396e164f3c60b7351cc06d0949b97161a565675371e8c1c
Apr 17 20:44:11.323206 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:11.323174 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e6f44f7_3183_4cd5_9000_e9662459d6af.slice/crio-ac83031ae82c295b88ad929a785f25317451b1df639a0866fb2d47d8fe904675 WatchSource:0}: Error finding container ac83031ae82c295b88ad929a785f25317451b1df639a0866fb2d47d8fe904675: Status 404 returned error can't find the container with id ac83031ae82c295b88ad929a785f25317451b1df639a0866fb2d47d8fe904675
Apr 17 20:44:11.326142 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:11.326120 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod486c966c_4220_4865_a777_76f49bb4fa62.slice/crio-1231a58a4673c1d24073f5e18617535055f9a2a8b60a66072ae333915f1dd22d WatchSource:0}: Error finding container 1231a58a4673c1d24073f5e18617535055f9a2a8b60a66072ae333915f1dd22d: Status 404 returned error can't find the container with id 1231a58a4673c1d24073f5e18617535055f9a2a8b60a66072ae333915f1dd22d
Apr 17 20:44:11.327284 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:11.327262 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod691501fe_3ef5_4d58_bc4a_7ce8a3702e4d.slice/crio-4f9f348e7d2b6c71c77a2c339c5a069b8b3af2454790e170255fb72d159ff9de WatchSource:0}: Error finding container 4f9f348e7d2b6c71c77a2c339c5a069b8b3af2454790e170255fb72d159ff9de: Status 404 returned error can't find the container with id 4f9f348e7d2b6c71c77a2c339c5a069b8b3af2454790e170255fb72d159ff9de
Apr 17 20:44:11.327667 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:11.327629 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod245c9064_c71a_4dce_bc16_4f45702063cf.slice/crio-48427d6d296bc14bcd02f9307987269170cc1b1371d7468d3b0f93853fd84515 WatchSource:0}: Error finding container 48427d6d296bc14bcd02f9307987269170cc1b1371d7468d3b0f93853fd84515: Status 404 returned error can't find the container with id 48427d6d296bc14bcd02f9307987269170cc1b1371d7468d3b0f93853fd84515
Apr 17 20:44:11.347476 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:11.347453 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff83db54_9c7a_4cea_8c98_f941a157f101.slice/crio-8b011726d3293c6c3656e8fbc30354f00a38468436dce86e95f0bb596f728302 WatchSource:0}: Error finding container 8b011726d3293c6c3656e8fbc30354f00a38468436dce86e95f0bb596f728302: Status 404 returned error can't find the container with id 8b011726d3293c6c3656e8fbc30354f00a38468436dce86e95f0bb596f728302
Apr 17 20:44:11.348411 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:11.348390 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf090e67f_4a19_4f45_a57b_d70ee9f84598.slice/crio-f64c22bd166b25cd7e83c8fd04c276bc62be41f4b36013035061d82d0faa0305 WatchSource:0}: Error finding container f64c22bd166b25cd7e83c8fd04c276bc62be41f4b36013035061d82d0faa0305: Status 404 returned error can't find the container with id f64c22bd166b25cd7e83c8fd04c276bc62be41f4b36013035061d82d0faa0305
Apr 17 20:44:11.349812 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:11.349782 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02dfb3c4_9530_4d2f_a953_075c7fc184b1.slice/crio-023c9fe1e6187d790a5f15b1496c2e4bcb88bf996c467b4c5f4fc51dda6281c9 WatchSource:0}: Error finding container 023c9fe1e6187d790a5f15b1496c2e4bcb88bf996c467b4c5f4fc51dda6281c9: Status 404 returned error can't find the container with id 023c9fe1e6187d790a5f15b1496c2e4bcb88bf996c467b4c5f4fc51dda6281c9
Apr 17 20:44:11.467816 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.467773 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58m7g\" (UniqueName: \"kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g\") pod \"network-check-target-4595v\" (UID: \"3c8131bf-3394-4f77-956d-2b283e575873\") " pod="openshift-network-diagnostics/network-check-target-4595v"
Apr 17 20:44:11.467927 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:11.467915 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:44:11.467990 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:11.467936 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:44:11.467990 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:11.467949 2572 projected.go:194] Error preparing data for projected volume kube-api-access-58m7g for pod openshift-network-diagnostics/network-check-target-4595v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:44:11.468086 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:11.468004 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g podName:3c8131bf-3394-4f77-956d-2b283e575873 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:12.467983776 +0000 UTC m=+4.193251128 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-58m7g" (UniqueName: "kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g") pod "network-check-target-4595v" (UID: "3c8131bf-3394-4f77-956d-2b283e575873") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:44:11.685185 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.685111 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:39:09 +0000 UTC" deadline="2027-12-11 11:39:04.741494228 +0000 UTC"
Apr 17 20:44:11.685185 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.685142 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14462h54m53.05635542s"
Apr 17 20:44:11.718841 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.718744 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bqh55" event={"ID":"f090e67f-4a19-4f45-a57b-d70ee9f84598","Type":"ContainerStarted","Data":"f64c22bd166b25cd7e83c8fd04c276bc62be41f4b36013035061d82d0faa0305"}
Apr 17 20:44:11.720031 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.719986 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-542j9" event={"ID":"ff83db54-9c7a-4cea-8c98-f941a157f101","Type":"ContainerStarted","Data":"8b011726d3293c6c3656e8fbc30354f00a38468436dce86e95f0bb596f728302"}
Apr 17 20:44:11.721348 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.721307 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k7xlk" event={"ID":"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d","Type":"ContainerStarted","Data":"4f9f348e7d2b6c71c77a2c339c5a069b8b3af2454790e170255fb72d159ff9de"}
Apr 17 20:44:11.723563 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.723507 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" event={"ID":"245c9064-c71a-4dce-bc16-4f45702063cf","Type":"ContainerStarted","Data":"48427d6d296bc14bcd02f9307987269170cc1b1371d7468d3b0f93853fd84515"}
Apr 17 20:44:11.726302 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.725677 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kk8ft" event={"ID":"4e6f44f7-3183-4cd5-9000-e9662459d6af","Type":"ContainerStarted","Data":"ac83031ae82c295b88ad929a785f25317451b1df639a0866fb2d47d8fe904675"}
Apr 17 20:44:11.727297 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.727273 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vh6vg" event={"ID":"73fda819-3b91-4892-92f0-995a9a9014c8","Type":"ContainerStarted","Data":"9c69abb9d177bcb8b8580a83d4e8776c18eec497c4c5ecb6a651fedf0b411fbb"}
Apr 17 20:44:11.730668 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.730347 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal" event={"ID":"7a467953d27af2c37f58628655d1416c","Type":"ContainerStarted","Data":"78ceca4ac236b6030767656a3c79c8a552916902c38a2862c7bbe854681ae885"}
Apr 17 20:44:11.733959 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.733936 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" event={"ID":"02dfb3c4-9530-4d2f-a953-075c7fc184b1","Type":"ContainerStarted","Data":"023c9fe1e6187d790a5f15b1496c2e4bcb88bf996c467b4c5f4fc51dda6281c9"}
Apr 17 20:44:11.737391 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.737368 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66wwv" event={"ID":"486c966c-4220-4865-a777-76f49bb4fa62","Type":"ContainerStarted","Data":"1231a58a4673c1d24073f5e18617535055f9a2a8b60a66072ae333915f1dd22d"}
Apr 17 20:44:11.741244 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:11.740954 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" event={"ID":"441d62c8-1471-4e50-af5c-5f5a36a868f7","Type":"ContainerStarted","Data":"43afb9882f32e04ed396e164f3c60b7351cc06d0949b97161a565675371e8c1c"}
Apr 17 20:44:12.273993 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:12.273894 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") " pod="openshift-multus/network-metrics-daemon-gggxp"
Apr 17 20:44:12.274154 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:12.274072 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:44:12.274154 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:12.274136 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs podName:83681f84-53f3-489d-9b30-0db22fc1b40e nodeName:}" failed. No retries permitted until 2026-04-17 20:44:14.274117336 +0000 UTC m=+5.999384677 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs") pod "network-metrics-daemon-gggxp" (UID: "83681f84-53f3-489d-9b30-0db22fc1b40e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:44:12.475487 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:12.475377 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58m7g\" (UniqueName: \"kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g\") pod \"network-check-target-4595v\" (UID: \"3c8131bf-3394-4f77-956d-2b283e575873\") " pod="openshift-network-diagnostics/network-check-target-4595v"
Apr 17 20:44:12.475653 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:12.475532 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:44:12.475653 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:12.475552 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:44:12.475653 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:12.475566 2572 projected.go:194] Error preparing data for projected volume kube-api-access-58m7g for pod openshift-network-diagnostics/network-check-target-4595v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:44:12.475653 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:12.475623 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g podName:3c8131bf-3394-4f77-956d-2b283e575873 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:14.475605001 +0000 UTC m=+6.200872342 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-58m7g" (UniqueName: "kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g") pod "network-check-target-4595v" (UID: "3c8131bf-3394-4f77-956d-2b283e575873") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:44:12.713829 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:12.712910 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp"
Apr 17 20:44:12.713829 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:12.713078 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e"
Apr 17 20:44:12.713829 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:12.713584 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v"
Apr 17 20:44:12.713829 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:12.713695 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873"
Apr 17 20:44:12.764445 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:12.764403 2572 generic.go:358] "Generic (PLEG): container finished" podID="e7435db675ee99a271ee76030c18b240" containerID="f250cc33b4d7b7785358e3211a846d8c6ed8dba49a9c16c00de3eaedd376a7f8" exitCode=0
Apr 17 20:44:12.764608 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:12.764545 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal" event={"ID":"e7435db675ee99a271ee76030c18b240","Type":"ContainerDied","Data":"f250cc33b4d7b7785358e3211a846d8c6ed8dba49a9c16c00de3eaedd376a7f8"}
Apr 17 20:44:12.777008 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:12.776087 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-102.ec2.internal" podStartSLOduration=3.776069669 podStartE2EDuration="3.776069669s" podCreationTimestamp="2026-04-17 20:44:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:44:11.743192567 +0000 UTC m=+3.468459926" watchObservedRunningTime="2026-04-17 20:44:12.776069669 +0000 UTC m=+4.501337026"
Apr 17 20:44:13.775422 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:13.775385 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal" event={"ID":"e7435db675ee99a271ee76030c18b240","Type":"ContainerStarted","Data":"c0d12c6e44e0bbb8d10fdc5c04d5fe733aee2f8f799f204db7e64e322a0a0798"}
Apr 17 20:44:14.290532 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:14.289953 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName:
\"kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") " pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:14.290532 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:14.290114 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:44:14.290532 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:14.290184 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs podName:83681f84-53f3-489d-9b30-0db22fc1b40e nodeName:}" failed. No retries permitted until 2026-04-17 20:44:18.29016384 +0000 UTC m=+10.015431181 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs") pod "network-metrics-daemon-gggxp" (UID: "83681f84-53f3-489d-9b30-0db22fc1b40e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:44:14.491029 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:14.490983 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58m7g\" (UniqueName: \"kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g\") pod \"network-check-target-4595v\" (UID: \"3c8131bf-3394-4f77-956d-2b283e575873\") " pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:44:14.491180 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:14.491149 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:44:14.491180 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:14.491169 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:44:14.491283 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:14.491183 2572 projected.go:194] Error preparing data for projected volume kube-api-access-58m7g for pod openshift-network-diagnostics/network-check-target-4595v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:44:14.491283 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:14.491245 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g podName:3c8131bf-3394-4f77-956d-2b283e575873 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:18.49122424 +0000 UTC m=+10.216491596 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-58m7g" (UniqueName: "kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g") pod "network-check-target-4595v" (UID: "3c8131bf-3394-4f77-956d-2b283e575873") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:44:14.709015 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:14.708933 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:14.709227 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:14.709068 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e" Apr 17 20:44:14.709227 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:14.708944 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:44:14.709227 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:14.709154 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873" Apr 17 20:44:16.708384 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:16.708351 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:44:16.708853 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:16.708386 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:16.708853 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:16.708490 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873" Apr 17 20:44:16.708853 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:16.708622 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e" Apr 17 20:44:18.325335 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:18.325294 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") " pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:18.325782 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:18.325504 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:44:18.325782 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:18.325578 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs podName:83681f84-53f3-489d-9b30-0db22fc1b40e nodeName:}" failed. No retries permitted until 2026-04-17 20:44:26.325556622 +0000 UTC m=+18.050823964 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs") pod "network-metrics-daemon-gggxp" (UID: "83681f84-53f3-489d-9b30-0db22fc1b40e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:44:18.527040 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:18.526995 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58m7g\" (UniqueName: \"kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g\") pod \"network-check-target-4595v\" (UID: \"3c8131bf-3394-4f77-956d-2b283e575873\") " pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:44:18.527237 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:18.527219 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:44:18.527317 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:18.527242 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:44:18.527317 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:18.527255 2572 projected.go:194] Error preparing data for projected volume kube-api-access-58m7g for pod openshift-network-diagnostics/network-check-target-4595v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:44:18.527421 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:18.527318 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g podName:3c8131bf-3394-4f77-956d-2b283e575873 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:44:26.527297796 +0000 UTC m=+18.252565154 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-58m7g" (UniqueName: "kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g") pod "network-check-target-4595v" (UID: "3c8131bf-3394-4f77-956d-2b283e575873") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:44:18.711605 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:18.711523 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:18.711771 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:18.711646 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e" Apr 17 20:44:18.712252 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:18.712059 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:44:18.712252 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:18.712225 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873" Apr 17 20:44:20.708665 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:20.708633 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:44:20.708665 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:20.708653 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:20.709272 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:20.708769 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873" Apr 17 20:44:20.709272 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:20.708878 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e" Apr 17 20:44:22.708678 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:22.708641 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:44:22.709117 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:22.708764 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873" Apr 17 20:44:22.709117 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:22.708836 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:22.709117 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:22.708936 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e" Apr 17 20:44:24.708975 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:24.708939 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:44:24.709386 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:24.708939 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:24.709386 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:24.709069 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873" Apr 17 20:44:24.709386 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:24.709178 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e" Apr 17 20:44:26.383773 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:26.383733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") " pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:26.384247 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:26.383902 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:44:26.384247 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:26.383975 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs podName:83681f84-53f3-489d-9b30-0db22fc1b40e nodeName:}" failed. 
No retries permitted until 2026-04-17 20:44:42.383955843 +0000 UTC m=+34.109223181 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs") pod "network-metrics-daemon-gggxp" (UID: "83681f84-53f3-489d-9b30-0db22fc1b40e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:44:26.585662 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:26.585629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58m7g\" (UniqueName: \"kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g\") pod \"network-check-target-4595v\" (UID: \"3c8131bf-3394-4f77-956d-2b283e575873\") " pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:44:26.585841 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:26.585758 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:44:26.585841 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:26.585774 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:44:26.585841 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:26.585784 2572 projected.go:194] Error preparing data for projected volume kube-api-access-58m7g for pod openshift-network-diagnostics/network-check-target-4595v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:44:26.585970 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:26.585846 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g 
podName:3c8131bf-3394-4f77-956d-2b283e575873 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:42.585832717 +0000 UTC m=+34.311100061 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-58m7g" (UniqueName: "kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g") pod "network-check-target-4595v" (UID: "3c8131bf-3394-4f77-956d-2b283e575873") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:44:26.708823 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:26.708726 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:26.708823 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:26.708756 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:44:26.709012 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:26.708877 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e" Apr 17 20:44:26.709052 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:26.709006 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873" Apr 17 20:44:28.709354 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.709190 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:28.710099 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:28.709426 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e" Apr 17 20:44:28.710099 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.709283 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:44:28.710099 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:28.709534 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873" Apr 17 20:44:28.799918 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.799881 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" event={"ID":"245c9064-c71a-4dce-bc16-4f45702063cf","Type":"ContainerStarted","Data":"b7238514d06cee27c390963b985ab55af9f10360142bbcba0d57e78ddad7e5a7"} Apr 17 20:44:28.801185 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.801160 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kk8ft" event={"ID":"4e6f44f7-3183-4cd5-9000-e9662459d6af","Type":"ContainerStarted","Data":"3684f34cc948ca64ad42ac2927e21d3fa2db639c036fc1e4c0d2f08c5792d1e4"} Apr 17 20:44:28.802436 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.802415 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vh6vg" event={"ID":"73fda819-3b91-4892-92f0-995a9a9014c8","Type":"ContainerStarted","Data":"f20b02e3147ac16230e54e9fe6763174a751e3581746d5d394eb2fd2c23789a0"} Apr 17 20:44:28.803951 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.803932 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/0.log" Apr 17 20:44:28.804244 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.804222 2572 generic.go:358] "Generic (PLEG): container finished" podID="02dfb3c4-9530-4d2f-a953-075c7fc184b1" containerID="37d71cbbe3eeea97f72f9a495ac81fcbbf5c4219139ed2969f4c061484523da3" exitCode=1 Apr 17 20:44:28.804312 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.804278 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" event={"ID":"02dfb3c4-9530-4d2f-a953-075c7fc184b1","Type":"ContainerStarted","Data":"06473afeae68241fea73e484a357cd61f5406c586cf4e2aef7de40a6d7ed9e30"} Apr 17 
20:44:28.804312 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.804295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" event={"ID":"02dfb3c4-9530-4d2f-a953-075c7fc184b1","Type":"ContainerDied","Data":"37d71cbbe3eeea97f72f9a495ac81fcbbf5c4219139ed2969f4c061484523da3"} Apr 17 20:44:28.804312 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.804306 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" event={"ID":"02dfb3c4-9530-4d2f-a953-075c7fc184b1","Type":"ContainerStarted","Data":"b987cd246c669695bd60d5b531680dc67007b98330f273d9b5299706a0084973"} Apr 17 20:44:28.805531 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.805511 2572 generic.go:358] "Generic (PLEG): container finished" podID="486c966c-4220-4865-a777-76f49bb4fa62" containerID="7a12d486fb92272f293081af4b25f8ceb1e976b332859690378796f3214c27cc" exitCode=0 Apr 17 20:44:28.805612 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.805537 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66wwv" event={"ID":"486c966c-4220-4865-a777-76f49bb4fa62","Type":"ContainerDied","Data":"7a12d486fb92272f293081af4b25f8ceb1e976b332859690378796f3214c27cc"} Apr 17 20:44:28.806976 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.806944 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" event={"ID":"441d62c8-1471-4e50-af5c-5f5a36a868f7","Type":"ContainerStarted","Data":"5b928180fa1d35631a7af5fa0044bea13332380511d51611af9428c213ec25de"} Apr 17 20:44:28.808268 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.808249 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bqh55" event={"ID":"f090e67f-4a19-4f45-a57b-d70ee9f84598","Type":"ContainerStarted","Data":"8f8637c0070887fbc25969959abeeeb2326320bf4df91853a35b524e2aee787e"} Apr 17 20:44:28.809455 
ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.809435 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k7xlk" event={"ID":"691501fe-3ef5-4d58-bc4a-7ce8a3702e4d","Type":"ContainerStarted","Data":"60dd503b5021df3297fdeb2e8194d94c87577375065245fcdf015a9dbfdaba3f"}
Apr 17 20:44:28.816461 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.816423 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-102.ec2.internal" podStartSLOduration=19.816413331 podStartE2EDuration="19.816413331s" podCreationTimestamp="2026-04-17 20:44:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:44:13.788229132 +0000 UTC m=+5.513496491" watchObservedRunningTime="2026-04-17 20:44:28.816413331 +0000 UTC m=+20.541680688"
Apr 17 20:44:28.816737 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.816712 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ncgcb" podStartSLOduration=3.917851062 podStartE2EDuration="20.816704165s" podCreationTimestamp="2026-04-17 20:44:08 +0000 UTC" firstStartedPulling="2026-04-17 20:44:11.346640191 +0000 UTC m=+3.071907527" lastFinishedPulling="2026-04-17 20:44:28.245493294 +0000 UTC m=+19.970760630" observedRunningTime="2026-04-17 20:44:28.816283391 +0000 UTC m=+20.541550761" watchObservedRunningTime="2026-04-17 20:44:28.816704165 +0000 UTC m=+20.541971572"
Apr 17 20:44:28.827345 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.827307 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vh6vg" podStartSLOduration=8.619191553 podStartE2EDuration="20.82729523s" podCreationTimestamp="2026-04-17 20:44:08 +0000 UTC" firstStartedPulling="2026-04-17 20:44:11.323375662 +0000 UTC m=+3.048643015" lastFinishedPulling="2026-04-17 20:44:23.531479353 +0000 UTC m=+15.256746692" observedRunningTime="2026-04-17 20:44:28.826863542 +0000 UTC m=+20.552130900" watchObservedRunningTime="2026-04-17 20:44:28.82729523 +0000 UTC m=+20.552562633"
Apr 17 20:44:28.858354 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.858307 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-k7xlk" podStartSLOduration=3.944330923 podStartE2EDuration="20.858293932s" podCreationTimestamp="2026-04-17 20:44:08 +0000 UTC" firstStartedPulling="2026-04-17 20:44:11.346605216 +0000 UTC m=+3.071872554" lastFinishedPulling="2026-04-17 20:44:28.260568224 +0000 UTC m=+19.985835563" observedRunningTime="2026-04-17 20:44:28.857629637 +0000 UTC m=+20.582897020" watchObservedRunningTime="2026-04-17 20:44:28.858293932 +0000 UTC m=+20.583561279"
Apr 17 20:44:28.871565 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.871527 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kk8ft" podStartSLOduration=3.996270777 podStartE2EDuration="20.871512617s" podCreationTimestamp="2026-04-17 20:44:08 +0000 UTC" firstStartedPulling="2026-04-17 20:44:11.325074695 +0000 UTC m=+3.050342032" lastFinishedPulling="2026-04-17 20:44:28.200316529 +0000 UTC m=+19.925583872" observedRunningTime="2026-04-17 20:44:28.871384895 +0000 UTC m=+20.596652252" watchObservedRunningTime="2026-04-17 20:44:28.871512617 +0000 UTC m=+20.596779973"
Apr 17 20:44:28.888755 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:28.888711 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-bqh55" podStartSLOduration=3.995592452 podStartE2EDuration="20.888696957s" podCreationTimestamp="2026-04-17 20:44:08 +0000 UTC" firstStartedPulling="2026-04-17 20:44:11.352860332 +0000 UTC m=+3.078127669" lastFinishedPulling="2026-04-17 20:44:28.245964838 +0000 UTC m=+19.971232174" observedRunningTime="2026-04-17 20:44:28.888532135 +0000 UTC m=+20.613799503" watchObservedRunningTime="2026-04-17 20:44:28.888696957 +0000 UTC m=+20.613964370"
Apr 17 20:44:29.503740 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:29.503535 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-bqh55"
Apr 17 20:44:29.504219 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:29.504192 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-bqh55"
Apr 17 20:44:29.815707 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:29.815687 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/0.log"
Apr 17 20:44:29.816390 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:29.816316 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" event={"ID":"02dfb3c4-9530-4d2f-a953-075c7fc184b1","Type":"ContainerStarted","Data":"1b45ce0267d8e8c31b37be3104d9f5d4cfaf3c31919aca511cfeb3703649f611"}
Apr 17 20:44:29.816390 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:29.816355 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" event={"ID":"02dfb3c4-9530-4d2f-a953-075c7fc184b1","Type":"ContainerStarted","Data":"abcfccab87196fef9573a7da1aac3760f6292ffdf6f23fb8afba5abfc1e6a841"}
Apr 17 20:44:29.816390 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:29.816372 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" event={"ID":"02dfb3c4-9530-4d2f-a953-075c7fc184b1","Type":"ContainerStarted","Data":"9732c59cdb92ee9e115104241cf6ce0a43d491485ce4f67458b7da323b9bdb0b"}
Apr 17 20:44:29.816946 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:29.816919 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-bqh55"
Apr 17 20:44:29.817167 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:29.817151 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-bqh55"
Apr 17 20:44:29.927840 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:29.927793 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 20:44:30.708982 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:30.708953 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp"
Apr 17 20:44:30.709195 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:30.708998 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v"
Apr 17 20:44:30.709195 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:30.709085 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e"
Apr 17 20:44:30.709195 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:30.709173 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873"
Apr 17 20:44:30.723415 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:30.723316 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T20:44:29.927825298Z","UUID":"1b49a686-caa5-4cda-9919-a558c5d3757b","Handler":null,"Name":"","Endpoint":""}
Apr 17 20:44:30.725137 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:30.725110 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 20:44:30.725247 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:30.725154 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 20:44:30.819116 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:30.819082 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" event={"ID":"441d62c8-1471-4e50-af5c-5f5a36a868f7","Type":"ContainerStarted","Data":"b31f2edc1aef97a90f451f82ccf9fc7620b2bf719ab8337efa5a13794155791a"}
Apr 17 20:44:30.821820 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:30.821781 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-542j9" event={"ID":"ff83db54-9c7a-4cea-8c98-f941a157f101","Type":"ContainerStarted","Data":"5e027ecaec0cbf48ee5266dc482df263537294f1f6bb7998c30de9f3f31d87cd"}
Apr 17 20:44:30.834386 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:30.834343 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-542j9" podStartSLOduration=5.936185539 podStartE2EDuration="22.834325896s" podCreationTimestamp="2026-04-17 20:44:08 +0000 UTC" firstStartedPulling="2026-04-17 20:44:11.349163644 +0000 UTC m=+3.074430982" lastFinishedPulling="2026-04-17 20:44:28.247303991 +0000 UTC m=+19.972571339" observedRunningTime="2026-04-17 20:44:30.834076204 +0000 UTC m=+22.559343561" watchObservedRunningTime="2026-04-17 20:44:30.834325896 +0000 UTC m=+22.559593253"
Apr 17 20:44:31.828973 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:31.828935 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" event={"ID":"441d62c8-1471-4e50-af5c-5f5a36a868f7","Type":"ContainerStarted","Data":"9478dd6c6845804a2facf08c2e76d318570a60b17437ee75b46212e2963c7bf6"}
Apr 17 20:44:31.831836 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:31.831813 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/0.log"
Apr 17 20:44:31.832228 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:31.832205 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" event={"ID":"02dfb3c4-9530-4d2f-a953-075c7fc184b1","Type":"ContainerStarted","Data":"b9de5438ca98716f952aa39891ef213cd85a308cb7cd3b5bab288d614bc8ce43"}
Apr 17 20:44:31.844289 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:31.844248 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xgvw7" podStartSLOduration=4.20536124 podStartE2EDuration="23.844236274s" podCreationTimestamp="2026-04-17 20:44:08 +0000 UTC" firstStartedPulling="2026-04-17 20:44:11.325179584 +0000 UTC m=+3.050446921" lastFinishedPulling="2026-04-17 20:44:30.964054604 +0000 UTC m=+22.689321955" observedRunningTime="2026-04-17 20:44:31.843902531 +0000 UTC m=+23.569169888" watchObservedRunningTime="2026-04-17 20:44:31.844236274 +0000 UTC m=+23.569503631"
Apr 17 20:44:32.708286 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:32.708251 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp"
Apr 17 20:44:32.708453 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:32.708371 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e"
Apr 17 20:44:32.708453 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:32.708431 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v"
Apr 17 20:44:32.708554 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:32.708531 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873"
Apr 17 20:44:34.708245 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:34.708023 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp"
Apr 17 20:44:34.708878 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:34.708073 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v"
Apr 17 20:44:34.708878 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:34.708270 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e"
Apr 17 20:44:34.708878 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:34.708324 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873"
Apr 17 20:44:34.840250 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:34.840225 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/0.log"
Apr 17 20:44:34.840553 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:34.840534 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" event={"ID":"02dfb3c4-9530-4d2f-a953-075c7fc184b1","Type":"ContainerStarted","Data":"8714891cdffba50e720fb26a4e44fa1246f2d06597e3a98a0e086131eeb1e09f"}
Apr 17 20:44:34.840839 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:34.840815 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:34.840839 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:34.840841 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:34.841007 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:34.840990 2572 scope.go:117] "RemoveContainer" containerID="37d71cbbe3eeea97f72f9a495ac81fcbbf5c4219139ed2969f4c061484523da3"
Apr 17 20:44:34.842223 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:34.842199 2572 generic.go:358] "Generic (PLEG): container finished" podID="486c966c-4220-4865-a777-76f49bb4fa62" containerID="593293d7b4380eef1f4e72e80b096ba172d676845b12a7b6c1cb7b19b3c85d86" exitCode=0
Apr 17 20:44:34.842309 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:34.842245 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66wwv" event={"ID":"486c966c-4220-4865-a777-76f49bb4fa62","Type":"ContainerDied","Data":"593293d7b4380eef1f4e72e80b096ba172d676845b12a7b6c1cb7b19b3c85d86"}
Apr 17 20:44:34.856269 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:34.856251 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:35.504871 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.504839 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gk7jn"]
Apr 17 20:44:35.506500 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.506481 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:35.506623 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:35.506549 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gk7jn" podUID="57c365ae-f2db-4533-9e62-b1193ccbe5c8"
Apr 17 20:44:35.657797 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.657750 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/57c365ae-f2db-4533-9e62-b1193ccbe5c8-dbus\") pod \"global-pull-secret-syncer-gk7jn\" (UID: \"57c365ae-f2db-4533-9e62-b1193ccbe5c8\") " pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:35.657994 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.657890 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret\") pod \"global-pull-secret-syncer-gk7jn\" (UID: \"57c365ae-f2db-4533-9e62-b1193ccbe5c8\") " pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:35.657994 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.657971 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/57c365ae-f2db-4533-9e62-b1193ccbe5c8-kubelet-config\") pod \"global-pull-secret-syncer-gk7jn\" (UID: \"57c365ae-f2db-4533-9e62-b1193ccbe5c8\") " pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:35.758530 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.758501 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/57c365ae-f2db-4533-9e62-b1193ccbe5c8-kubelet-config\") pod \"global-pull-secret-syncer-gk7jn\" (UID: \"57c365ae-f2db-4533-9e62-b1193ccbe5c8\") " pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:35.758973 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.758539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/57c365ae-f2db-4533-9e62-b1193ccbe5c8-dbus\") pod \"global-pull-secret-syncer-gk7jn\" (UID: \"57c365ae-f2db-4533-9e62-b1193ccbe5c8\") " pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:35.758973 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.758587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret\") pod \"global-pull-secret-syncer-gk7jn\" (UID: \"57c365ae-f2db-4533-9e62-b1193ccbe5c8\") " pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:35.758973 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.758625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/57c365ae-f2db-4533-9e62-b1193ccbe5c8-kubelet-config\") pod \"global-pull-secret-syncer-gk7jn\" (UID: \"57c365ae-f2db-4533-9e62-b1193ccbe5c8\") " pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:35.758973 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:35.758703 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:35.758973 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:35.758765 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret podName:57c365ae-f2db-4533-9e62-b1193ccbe5c8 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:36.258748699 +0000 UTC m=+27.984016049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret") pod "global-pull-secret-syncer-gk7jn" (UID: "57c365ae-f2db-4533-9e62-b1193ccbe5c8") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:35.758973 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.758814 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/57c365ae-f2db-4533-9e62-b1193ccbe5c8-dbus\") pod \"global-pull-secret-syncer-gk7jn\" (UID: \"57c365ae-f2db-4533-9e62-b1193ccbe5c8\") " pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:35.848307 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.848281 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/0.log"
Apr 17 20:44:35.848690 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.848658 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" event={"ID":"02dfb3c4-9530-4d2f-a953-075c7fc184b1","Type":"ContainerStarted","Data":"bdb2d8bdfdd464ab5cbe4f63a13958e6bd0ccc3a414d792ed00d52f92b8b23b9"}
Apr 17 20:44:35.849107 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.849081 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:35.851489 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.851463 2572 generic.go:358] "Generic (PLEG): container finished" podID="486c966c-4220-4865-a777-76f49bb4fa62" containerID="610cce3f751ff12be333836332e3020a9fb5d47c9739acc0187a6f690427e7b1" exitCode=0
Apr 17 20:44:35.851581 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.851522 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66wwv" event={"ID":"486c966c-4220-4865-a777-76f49bb4fa62","Type":"ContainerDied","Data":"610cce3f751ff12be333836332e3020a9fb5d47c9739acc0187a6f690427e7b1"}
Apr 17 20:44:35.865454 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.865299 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd"
Apr 17 20:44:35.874028 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:35.873987 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" podStartSLOduration=10.929688067 podStartE2EDuration="27.873976844s" podCreationTimestamp="2026-04-17 20:44:08 +0000 UTC" firstStartedPulling="2026-04-17 20:44:11.352833094 +0000 UTC m=+3.078100446" lastFinishedPulling="2026-04-17 20:44:28.297121888 +0000 UTC m=+20.022389223" observedRunningTime="2026-04-17 20:44:35.87292277 +0000 UTC m=+27.598190130" watchObservedRunningTime="2026-04-17 20:44:35.873976844 +0000 UTC m=+27.599244200"
Apr 17 20:44:36.104382 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:36.104307 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gk7jn"]
Apr 17 20:44:36.104541 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:36.104450 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:36.104583 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:36.104548 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gk7jn" podUID="57c365ae-f2db-4533-9e62-b1193ccbe5c8"
Apr 17 20:44:36.106903 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:36.106872 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4595v"]
Apr 17 20:44:36.107025 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:36.106959 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v"
Apr 17 20:44:36.107092 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:36.107023 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873"
Apr 17 20:44:36.115196 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:36.115175 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gggxp"]
Apr 17 20:44:36.115270 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:36.115259 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp"
Apr 17 20:44:36.115356 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:36.115341 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e"
Apr 17 20:44:36.263476 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:36.263443 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret\") pod \"global-pull-secret-syncer-gk7jn\" (UID: \"57c365ae-f2db-4533-9e62-b1193ccbe5c8\") " pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:36.263633 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:36.263588 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:36.263679 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:36.263651 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret podName:57c365ae-f2db-4533-9e62-b1193ccbe5c8 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:37.263637169 +0000 UTC m=+28.988904525 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret") pod "global-pull-secret-syncer-gk7jn" (UID: "57c365ae-f2db-4533-9e62-b1193ccbe5c8") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:36.854618 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:36.854541 2572 generic.go:358] "Generic (PLEG): container finished" podID="486c966c-4220-4865-a777-76f49bb4fa62" containerID="26bec8aad78760f5490899df7088e9661240e761bebbc7dd5d817986d626ba8b" exitCode=0
Apr 17 20:44:36.855175 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:36.854624 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66wwv" event={"ID":"486c966c-4220-4865-a777-76f49bb4fa62","Type":"ContainerDied","Data":"26bec8aad78760f5490899df7088e9661240e761bebbc7dd5d817986d626ba8b"}
Apr 17 20:44:37.271585 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:37.271489 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret\") pod \"global-pull-secret-syncer-gk7jn\" (UID: \"57c365ae-f2db-4533-9e62-b1193ccbe5c8\") " pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:37.271743 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:37.271621 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:37.271743 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:37.271674 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret podName:57c365ae-f2db-4533-9e62-b1193ccbe5c8 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:39.271661268 +0000 UTC m=+30.996928608 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret") pod "global-pull-secret-syncer-gk7jn" (UID: "57c365ae-f2db-4533-9e62-b1193ccbe5c8") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:37.708813 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:37.708760 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:37.708957 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:37.708760 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp"
Apr 17 20:44:37.708957 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:37.708917 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gk7jn" podUID="57c365ae-f2db-4533-9e62-b1193ccbe5c8"
Apr 17 20:44:37.709073 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:37.708760 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v"
Apr 17 20:44:37.709073 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:37.709008 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e"
Apr 17 20:44:37.709159 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:37.709067 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873"
Apr 17 20:44:39.288052 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:39.288013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret\") pod \"global-pull-secret-syncer-gk7jn\" (UID: \"57c365ae-f2db-4533-9e62-b1193ccbe5c8\") " pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:39.288673 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:39.288170 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:39.288673 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:39.288233 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret podName:57c365ae-f2db-4533-9e62-b1193ccbe5c8 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:43.288216736 +0000 UTC m=+35.013484086 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret") pod "global-pull-secret-syncer-gk7jn" (UID: "57c365ae-f2db-4533-9e62-b1193ccbe5c8") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:39.708548 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:39.708471 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp"
Apr 17 20:44:39.708708 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:39.708471 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v"
Apr 17 20:44:39.708708 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:39.708591 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e"
Apr 17 20:44:39.708708 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:39.708629 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4595v" podUID="3c8131bf-3394-4f77-956d-2b283e575873"
Apr 17 20:44:39.708708 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:39.708471 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:39.708708 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:39.708682 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gk7jn" podUID="57c365ae-f2db-4533-9e62-b1193ccbe5c8"
Apr 17 20:44:41.118256 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.118227 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-102.ec2.internal" event="NodeReady"
Apr 17 20:44:41.118624 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.118362 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 20:44:41.149354 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.149321 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-cd894dd8c-8msng"]
Apr 17 20:44:41.154183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.154161 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cd894dd8c-8msng"
Apr 17 20:44:41.156481 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.156452 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 20:44:41.156583 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.156554 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6qbmr\""
Apr 17 20:44:41.156745 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.156706 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 20:44:41.156745 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.156730 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 20:44:41.162460 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.161863 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 20:44:41.163123 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.163067 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-cd894dd8c-8msng"]
Apr 17 20:44:41.163680 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.163662 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zbsnq"]
Apr 17 20:44:41.166706 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.166691 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6qvxz"]
Apr 17 20:44:41.166889 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.166873 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zbsnq"
Apr 17 20:44:41.168619 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.168593 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 20:44:41.168711 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.168594 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 20:44:41.168711 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.168702 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8pvd4\""
Apr 17 20:44:41.170363 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.170346 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6qvxz"
Apr 17 20:44:41.172129 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.172109 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 20:44:41.172285 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.172248 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pn5wd\""
Apr 17 20:44:41.172370 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.172361 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 20:44:41.172425 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.172367 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 20:44:41.174498 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.174412 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6qvxz"]
Apr 17 20:44:41.175142 ip-10-0-137-102 kubenswrapper[2572]:
I0417 20:44:41.175112 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zbsnq"] Apr 17 20:44:41.302697 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.302653 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c75b554f-1463-4bda-8049-f6e4988ffef7-tmp-dir\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:44:41.302885 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.302706 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw7f5\" (UniqueName: \"kubernetes.io/projected/79374dd1-1272-4edf-9d10-449bca8feb97-kube-api-access-kw7f5\") pod \"ingress-canary-6qvxz\" (UID: \"79374dd1-1272-4edf-9d10-449bca8feb97\") " pod="openshift-ingress-canary/ingress-canary-6qvxz" Apr 17 20:44:41.302885 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.302750 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.302885 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.302774 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d441e83e-a15a-4717-ba02-8beb555aad02-installation-pull-secrets\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.302885 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.302814 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c75b554f-1463-4bda-8049-f6e4988ffef7-config-volume\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:44:41.302885 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.302874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d441e83e-a15a-4717-ba02-8beb555aad02-trusted-ca\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.303170 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.302913 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert\") pod \"ingress-canary-6qvxz\" (UID: \"79374dd1-1272-4edf-9d10-449bca8feb97\") " pod="openshift-ingress-canary/ingress-canary-6qvxz" Apr 17 20:44:41.303170 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.302971 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:44:41.303170 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.303003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d441e83e-a15a-4717-ba02-8beb555aad02-registry-certificates\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 
17 20:44:41.303170 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.303029 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d441e83e-a15a-4717-ba02-8beb555aad02-ca-trust-extracted\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.303170 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.303050 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rblzb\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-kube-api-access-rblzb\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.303170 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.303064 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7rbp\" (UniqueName: \"kubernetes.io/projected/c75b554f-1463-4bda-8049-f6e4988ffef7-kube-api-access-h7rbp\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:44:41.303170 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.303110 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d441e83e-a15a-4717-ba02-8beb555aad02-image-registry-private-configuration\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.303170 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.303135 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-bound-sa-token\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.404023 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.403987 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:44:41.404188 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.404040 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d441e83e-a15a-4717-ba02-8beb555aad02-registry-certificates\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.404188 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.404066 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d441e83e-a15a-4717-ba02-8beb555aad02-ca-trust-extracted\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.404188 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.404091 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rblzb\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-kube-api-access-rblzb\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.404356 
ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.404243 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7rbp\" (UniqueName: \"kubernetes.io/projected/c75b554f-1463-4bda-8049-f6e4988ffef7-kube-api-access-h7rbp\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:44:41.404356 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.404291 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d441e83e-a15a-4717-ba02-8beb555aad02-image-registry-private-configuration\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.404356 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.404317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-bound-sa-token\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.404356 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:41.404316 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:44:41.404542 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:41.404449 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls podName:c75b554f-1463-4bda-8049-f6e4988ffef7 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:41.904428682 +0000 UTC m=+33.629696043 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls") pod "dns-default-zbsnq" (UID: "c75b554f-1463-4bda-8049-f6e4988ffef7") : secret "dns-default-metrics-tls" not found Apr 17 20:44:41.404542 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.404475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c75b554f-1463-4bda-8049-f6e4988ffef7-tmp-dir\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:44:41.404542 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.404512 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kw7f5\" (UniqueName: \"kubernetes.io/projected/79374dd1-1272-4edf-9d10-449bca8feb97-kube-api-access-kw7f5\") pod \"ingress-canary-6qvxz\" (UID: \"79374dd1-1272-4edf-9d10-449bca8feb97\") " pod="openshift-ingress-canary/ingress-canary-6qvxz" Apr 17 20:44:41.404700 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.404543 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.404700 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.404570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d441e83e-a15a-4717-ba02-8beb555aad02-installation-pull-secrets\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.404700 ip-10-0-137-102 kubenswrapper[2572]: I0417 
20:44:41.404642 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c75b554f-1463-4bda-8049-f6e4988ffef7-config-volume\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:44:41.404700 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.404683 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d441e83e-a15a-4717-ba02-8beb555aad02-trusted-ca\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.404931 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.404747 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert\") pod \"ingress-canary-6qvxz\" (UID: \"79374dd1-1272-4edf-9d10-449bca8feb97\") " pod="openshift-ingress-canary/ingress-canary-6qvxz" Apr 17 20:44:41.404931 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:41.404908 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:44:41.404931 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.404911 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c75b554f-1463-4bda-8049-f6e4988ffef7-tmp-dir\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:44:41.405084 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.404940 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d441e83e-a15a-4717-ba02-8beb555aad02-ca-trust-extracted\") pod 
\"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.405084 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:41.404960 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert podName:79374dd1-1272-4edf-9d10-449bca8feb97 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:41.904945396 +0000 UTC m=+33.630212749 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert") pod "ingress-canary-6qvxz" (UID: "79374dd1-1272-4edf-9d10-449bca8feb97") : secret "canary-serving-cert" not found Apr 17 20:44:41.405084 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:41.405020 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:44:41.405084 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:41.405035 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cd894dd8c-8msng: secret "image-registry-tls" not found Apr 17 20:44:41.405281 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:41.405150 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls podName:d441e83e-a15a-4717-ba02-8beb555aad02 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:41.905135287 +0000 UTC m=+33.630402624 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls") pod "image-registry-cd894dd8c-8msng" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02") : secret "image-registry-tls" not found Apr 17 20:44:41.405281 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.405247 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d441e83e-a15a-4717-ba02-8beb555aad02-registry-certificates\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.405410 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.405388 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c75b554f-1463-4bda-8049-f6e4988ffef7-config-volume\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:44:41.405938 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.405878 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d441e83e-a15a-4717-ba02-8beb555aad02-trusted-ca\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.409836 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.409692 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d441e83e-a15a-4717-ba02-8beb555aad02-image-registry-private-configuration\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.409923 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:44:41.409711 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d441e83e-a15a-4717-ba02-8beb555aad02-installation-pull-secrets\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.415269 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.415226 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw7f5\" (UniqueName: \"kubernetes.io/projected/79374dd1-1272-4edf-9d10-449bca8feb97-kube-api-access-kw7f5\") pod \"ingress-canary-6qvxz\" (UID: \"79374dd1-1272-4edf-9d10-449bca8feb97\") " pod="openshift-ingress-canary/ingress-canary-6qvxz" Apr 17 20:44:41.415413 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.415369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7rbp\" (UniqueName: \"kubernetes.io/projected/c75b554f-1463-4bda-8049-f6e4988ffef7-kube-api-access-h7rbp\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:44:41.415413 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.415402 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rblzb\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-kube-api-access-rblzb\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.415691 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.415675 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-bound-sa-token\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " 
pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:41.708991 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.708889 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gk7jn" Apr 17 20:44:41.709157 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.709019 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:44:41.709157 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.709029 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:44:41.710834 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.710817 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 20:44:41.710947 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.710915 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 20:44:41.710947 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.710927 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jzn7k\"" Apr 17 20:44:41.711049 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.711035 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 20:44:41.711088 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.711077 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 20:44:41.711437 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.711420 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nb4cm\"" 
Apr 17 20:44:41.908189 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.908153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng"
Apr 17 20:44:41.908374 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.908199 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert\") pod \"ingress-canary-6qvxz\" (UID: \"79374dd1-1272-4edf-9d10-449bca8feb97\") " pod="openshift-ingress-canary/ingress-canary-6qvxz"
Apr 17 20:44:41.908374 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:41.908252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq"
Apr 17 20:44:41.908374 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:41.908316 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:44:41.908374 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:41.908335 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cd894dd8c-8msng: secret "image-registry-tls" not found
Apr 17 20:44:41.908374 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:41.908355 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:44:41.908625 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:41.908355 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:44:41.908625 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:41.908393 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls podName:d441e83e-a15a-4717-ba02-8beb555aad02 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:42.908373606 +0000 UTC m=+34.633640947 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls") pod "image-registry-cd894dd8c-8msng" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02") : secret "image-registry-tls" not found
Apr 17 20:44:41.908625 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:41.908414 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert podName:79374dd1-1272-4edf-9d10-449bca8feb97 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:42.908405266 +0000 UTC m=+34.633672606 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert") pod "ingress-canary-6qvxz" (UID: "79374dd1-1272-4edf-9d10-449bca8feb97") : secret "canary-serving-cert" not found
Apr 17 20:44:41.908625 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:41.908431 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls podName:c75b554f-1463-4bda-8049-f6e4988ffef7 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:42.908420695 +0000 UTC m=+34.633688035 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls") pod "dns-default-zbsnq" (UID: "c75b554f-1463-4bda-8049-f6e4988ffef7") : secret "dns-default-metrics-tls" not found
Apr 17 20:44:42.410831 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:42.410779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") " pod="openshift-multus/network-metrics-daemon-gggxp"
Apr 17 20:44:42.411277 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:42.410939 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 20:44:42.411277 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:42.411022 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs podName:83681f84-53f3-489d-9b30-0db22fc1b40e nodeName:}" failed. No retries permitted until 2026-04-17 20:45:14.410999876 +0000 UTC m=+66.136267229 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs") pod "network-metrics-daemon-gggxp" (UID: "83681f84-53f3-489d-9b30-0db22fc1b40e") : secret "metrics-daemon-secret" not found
Apr 17 20:44:42.612353 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:42.612316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58m7g\" (UniqueName: \"kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g\") pod \"network-check-target-4595v\" (UID: \"3c8131bf-3394-4f77-956d-2b283e575873\") " pod="openshift-network-diagnostics/network-check-target-4595v"
Apr 17 20:44:42.615382 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:42.615360 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58m7g\" (UniqueName: \"kubernetes.io/projected/3c8131bf-3394-4f77-956d-2b283e575873-kube-api-access-58m7g\") pod \"network-check-target-4595v\" (UID: \"3c8131bf-3394-4f77-956d-2b283e575873\") " pod="openshift-network-diagnostics/network-check-target-4595v"
Apr 17 20:44:42.629464 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:42.629442 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4595v"
Apr 17 20:44:42.914943 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:42.914907 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq"
Apr 17 20:44:42.915128 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:42.914978 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng"
Apr 17 20:44:42.915128 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:42.915009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert\") pod \"ingress-canary-6qvxz\" (UID: \"79374dd1-1272-4edf-9d10-449bca8feb97\") " pod="openshift-ingress-canary/ingress-canary-6qvxz"
Apr 17 20:44:42.915128 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:42.915072 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:44:42.915287 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:42.915132 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:44:42.915287 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:42.915136 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:44:42.915287 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:42.915156 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cd894dd8c-8msng: secret "image-registry-tls" not found
Apr 17 20:44:42.915287 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:42.915144 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls podName:c75b554f-1463-4bda-8049-f6e4988ffef7 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:44.91512372 +0000 UTC m=+36.640391077 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls") pod "dns-default-zbsnq" (UID: "c75b554f-1463-4bda-8049-f6e4988ffef7") : secret "dns-default-metrics-tls" not found
Apr 17 20:44:42.915287 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:42.915214 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert podName:79374dd1-1272-4edf-9d10-449bca8feb97 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:44.915197706 +0000 UTC m=+36.640465047 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert") pod "ingress-canary-6qvxz" (UID: "79374dd1-1272-4edf-9d10-449bca8feb97") : secret "canary-serving-cert" not found
Apr 17 20:44:42.915287 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:42.915232 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls podName:d441e83e-a15a-4717-ba02-8beb555aad02 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:44.915221668 +0000 UTC m=+36.640489003 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls") pod "image-registry-cd894dd8c-8msng" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02") : secret "image-registry-tls" not found
Apr 17 20:44:43.317991 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:43.317951 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret\") pod \"global-pull-secret-syncer-gk7jn\" (UID: \"57c365ae-f2db-4533-9e62-b1193ccbe5c8\") " pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:43.320388 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:43.320361 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/57c365ae-f2db-4533-9e62-b1193ccbe5c8-original-pull-secret\") pod \"global-pull-secret-syncer-gk7jn\" (UID: \"57c365ae-f2db-4533-9e62-b1193ccbe5c8\") " pod="kube-system/global-pull-secret-syncer-gk7jn"
Apr 17 20:44:43.518215 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:43.518175 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gk7jn" Apr 17 20:44:44.334390 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:44.334367 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4595v"] Apr 17 20:44:44.336436 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:44.336413 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gk7jn"] Apr 17 20:44:44.338337 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:44.338304 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c8131bf_3394_4f77_956d_2b283e575873.slice/crio-50e21b362ba13e42ca9895bb145f6a43eb672c33541c0edce07c5b4afa4868cb WatchSource:0}: Error finding container 50e21b362ba13e42ca9895bb145f6a43eb672c33541c0edce07c5b4afa4868cb: Status 404 returned error can't find the container with id 50e21b362ba13e42ca9895bb145f6a43eb672c33541c0edce07c5b4afa4868cb Apr 17 20:44:44.339850 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:44:44.339823 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57c365ae_f2db_4533_9e62_b1193ccbe5c8.slice/crio-c471ad36eb6797fbc65aaa9c36a8bd6204b0802bb16f94776e8bd474a034ab89 WatchSource:0}: Error finding container c471ad36eb6797fbc65aaa9c36a8bd6204b0802bb16f94776e8bd474a034ab89: Status 404 returned error can't find the container with id c471ad36eb6797fbc65aaa9c36a8bd6204b0802bb16f94776e8bd474a034ab89 Apr 17 20:44:44.871745 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:44.871496 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gk7jn" event={"ID":"57c365ae-f2db-4533-9e62-b1193ccbe5c8","Type":"ContainerStarted","Data":"c471ad36eb6797fbc65aaa9c36a8bd6204b0802bb16f94776e8bd474a034ab89"} Apr 17 20:44:44.872714 ip-10-0-137-102 kubenswrapper[2572]: I0417 
20:44:44.872688 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4595v" event={"ID":"3c8131bf-3394-4f77-956d-2b283e575873","Type":"ContainerStarted","Data":"50e21b362ba13e42ca9895bb145f6a43eb672c33541c0edce07c5b4afa4868cb"} Apr 17 20:44:44.876171 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:44.875600 2572 generic.go:358] "Generic (PLEG): container finished" podID="486c966c-4220-4865-a777-76f49bb4fa62" containerID="ad49a32675337afe53605e9acd95c31c65f05f2b6cf39d37f875886a6a7fa612" exitCode=0 Apr 17 20:44:44.876171 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:44.875641 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66wwv" event={"ID":"486c966c-4220-4865-a777-76f49bb4fa62","Type":"ContainerDied","Data":"ad49a32675337afe53605e9acd95c31c65f05f2b6cf39d37f875886a6a7fa612"} Apr 17 20:44:44.930588 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:44.930553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:44.930714 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:44.930618 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert\") pod \"ingress-canary-6qvxz\" (UID: \"79374dd1-1272-4edf-9d10-449bca8feb97\") " pod="openshift-ingress-canary/ingress-canary-6qvxz" Apr 17 20:44:44.930714 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:44.930680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls\") pod 
\"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:44:44.930838 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:44.930826 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:44:44.930896 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:44.930886 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls podName:c75b554f-1463-4bda-8049-f6e4988ffef7 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:48.930867635 +0000 UTC m=+40.656134984 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls") pod "dns-default-zbsnq" (UID: "c75b554f-1463-4bda-8049-f6e4988ffef7") : secret "dns-default-metrics-tls" not found Apr 17 20:44:44.930957 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:44.930943 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:44:44.930957 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:44.930948 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:44:44.931058 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:44.930962 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cd894dd8c-8msng: secret "image-registry-tls" not found Apr 17 20:44:44.931058 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:44.930973 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert podName:79374dd1-1272-4edf-9d10-449bca8feb97 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:44:48.930962183 +0000 UTC m=+40.656229526 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert") pod "ingress-canary-6qvxz" (UID: "79374dd1-1272-4edf-9d10-449bca8feb97") : secret "canary-serving-cert" not found Apr 17 20:44:44.931058 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:44.930999 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls podName:d441e83e-a15a-4717-ba02-8beb555aad02 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:48.930986866 +0000 UTC m=+40.656254207 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls") pod "image-registry-cd894dd8c-8msng" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02") : secret "image-registry-tls" not found Apr 17 20:44:45.880889 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:45.880847 2572 generic.go:358] "Generic (PLEG): container finished" podID="486c966c-4220-4865-a777-76f49bb4fa62" containerID="78af4150e71264048f84f49c4108c9fc091a7f81f3973b4888d57e43366b8bcf" exitCode=0 Apr 17 20:44:45.881293 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:45.880900 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66wwv" event={"ID":"486c966c-4220-4865-a777-76f49bb4fa62","Type":"ContainerDied","Data":"78af4150e71264048f84f49c4108c9fc091a7f81f3973b4888d57e43366b8bcf"} Apr 17 20:44:46.886638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:46.886604 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66wwv" event={"ID":"486c966c-4220-4865-a777-76f49bb4fa62","Type":"ContainerStarted","Data":"ad23ce3dee7f0f66551c987055e5851a5a06480b6446c305dd91899eb5b65672"} Apr 17 
20:44:46.908225 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:46.908152 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-66wwv" podStartSLOduration=6.017462932 podStartE2EDuration="38.90813396s" podCreationTimestamp="2026-04-17 20:44:08 +0000 UTC" firstStartedPulling="2026-04-17 20:44:11.346658316 +0000 UTC m=+3.071925652" lastFinishedPulling="2026-04-17 20:44:44.237329342 +0000 UTC m=+35.962596680" observedRunningTime="2026-04-17 20:44:46.906594116 +0000 UTC m=+38.631861474" watchObservedRunningTime="2026-04-17 20:44:46.90813396 +0000 UTC m=+38.633401317" Apr 17 20:44:48.963463 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:48.963426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:48.963463 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:48.963469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert\") pod \"ingress-canary-6qvxz\" (UID: \"79374dd1-1272-4edf-9d10-449bca8feb97\") " pod="openshift-ingress-canary/ingress-canary-6qvxz" Apr 17 20:44:48.963879 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:48.963504 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:44:48.963879 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:48.963571 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: 
secret "image-registry-tls" not found Apr 17 20:44:48.963879 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:48.963591 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cd894dd8c-8msng: secret "image-registry-tls" not found Apr 17 20:44:48.963879 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:48.963597 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:44:48.963879 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:48.963614 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:44:48.963879 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:48.963647 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls podName:d441e83e-a15a-4717-ba02-8beb555aad02 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:56.963632547 +0000 UTC m=+48.688899887 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls") pod "image-registry-cd894dd8c-8msng" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02") : secret "image-registry-tls" not found Apr 17 20:44:48.963879 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:48.963659 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert podName:79374dd1-1272-4edf-9d10-449bca8feb97 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:56.963654299 +0000 UTC m=+48.688921634 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert") pod "ingress-canary-6qvxz" (UID: "79374dd1-1272-4edf-9d10-449bca8feb97") : secret "canary-serving-cert" not found Apr 17 20:44:48.963879 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:48.963669 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls podName:c75b554f-1463-4bda-8049-f6e4988ffef7 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:56.963664189 +0000 UTC m=+48.688931524 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls") pod "dns-default-zbsnq" (UID: "c75b554f-1463-4bda-8049-f6e4988ffef7") : secret "dns-default-metrics-tls" not found Apr 17 20:44:49.893562 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:49.893526 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4595v" event={"ID":"3c8131bf-3394-4f77-956d-2b283e575873","Type":"ContainerStarted","Data":"71dd23f5ff3f751473deebab0cac91e836294aacfda9bb894f9151c818bdd649"} Apr 17 20:44:49.893751 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:49.893617 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:44:49.894749 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:49.894716 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gk7jn" event={"ID":"57c365ae-f2db-4533-9e62-b1193ccbe5c8","Type":"ContainerStarted","Data":"c5c27141bee1eaeb461ce1c67b1adac52d7b7da76d931184ba6dc9f15750519e"} Apr 17 20:44:49.906497 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:49.906458 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-4595v" podStartSLOduration=37.047237212 podStartE2EDuration="41.906446951s" podCreationTimestamp="2026-04-17 20:44:08 +0000 UTC" firstStartedPulling="2026-04-17 20:44:44.340197226 +0000 UTC m=+36.065464560" lastFinishedPulling="2026-04-17 20:44:49.19940696 +0000 UTC m=+40.924674299" observedRunningTime="2026-04-17 20:44:49.905886819 +0000 UTC m=+41.631154177" watchObservedRunningTime="2026-04-17 20:44:49.906446951 +0000 UTC m=+41.631714307" Apr 17 20:44:49.918353 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:49.918307 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gk7jn" podStartSLOduration=10.049453099 podStartE2EDuration="14.918293154s" podCreationTimestamp="2026-04-17 20:44:35 +0000 UTC" firstStartedPulling="2026-04-17 20:44:44.341336746 +0000 UTC m=+36.066604082" lastFinishedPulling="2026-04-17 20:44:49.210176797 +0000 UTC m=+40.935444137" observedRunningTime="2026-04-17 20:44:49.917512363 +0000 UTC m=+41.642779722" watchObservedRunningTime="2026-04-17 20:44:49.918293154 +0000 UTC m=+41.643560511" Apr 17 20:44:57.016641 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:57.016601 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:44:57.017248 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:44:57.016653 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:44:57.017248 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:44:57.016676 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert\") pod \"ingress-canary-6qvxz\" (UID: \"79374dd1-1272-4edf-9d10-449bca8feb97\") " pod="openshift-ingress-canary/ingress-canary-6qvxz" Apr 17 20:44:57.017248 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:57.016755 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:44:57.017248 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:57.016765 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:44:57.017248 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:57.016795 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cd894dd8c-8msng: secret "image-registry-tls" not found Apr 17 20:44:57.017248 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:57.016828 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert podName:79374dd1-1272-4edf-9d10-449bca8feb97 nodeName:}" failed. No retries permitted until 2026-04-17 20:45:13.016796442 +0000 UTC m=+64.742063777 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert") pod "ingress-canary-6qvxz" (UID: "79374dd1-1272-4edf-9d10-449bca8feb97") : secret "canary-serving-cert" not found Apr 17 20:44:57.017248 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:57.016753 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:44:57.017248 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:57.016879 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls podName:d441e83e-a15a-4717-ba02-8beb555aad02 nodeName:}" failed. No retries permitted until 2026-04-17 20:45:13.016860115 +0000 UTC m=+64.742127449 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls") pod "image-registry-cd894dd8c-8msng" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02") : secret "image-registry-tls" not found Apr 17 20:44:57.017248 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:44:57.016897 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls podName:c75b554f-1463-4bda-8049-f6e4988ffef7 nodeName:}" failed. No retries permitted until 2026-04-17 20:45:13.016888875 +0000 UTC m=+64.742156210 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls") pod "dns-default-zbsnq" (UID: "c75b554f-1463-4bda-8049-f6e4988ffef7") : secret "dns-default-metrics-tls" not found Apr 17 20:45:07.867206 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:45:07.867179 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-trqtd" Apr 17 20:45:13.019434 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:45:13.019394 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert\") pod \"ingress-canary-6qvxz\" (UID: \"79374dd1-1272-4edf-9d10-449bca8feb97\") " pod="openshift-ingress-canary/ingress-canary-6qvxz" Apr 17 20:45:13.019866 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:45:13.019450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:45:13.019866 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:45:13.019500 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:45:13.019866 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:13.019542 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:45:13.019866 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:13.019586 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls podName:c75b554f-1463-4bda-8049-f6e4988ffef7 nodeName:}" failed. No retries permitted until 2026-04-17 20:45:45.019571871 +0000 UTC m=+96.744839206 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls") pod "dns-default-zbsnq" (UID: "c75b554f-1463-4bda-8049-f6e4988ffef7") : secret "dns-default-metrics-tls" not found Apr 17 20:45:13.019866 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:13.019599 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:45:13.019866 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:13.019613 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cd894dd8c-8msng: secret "image-registry-tls" not found Apr 17 20:45:13.019866 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:13.019649 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:45:13.019866 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:13.019679 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls podName:d441e83e-a15a-4717-ba02-8beb555aad02 nodeName:}" failed. No retries permitted until 2026-04-17 20:45:45.01965952 +0000 UTC m=+96.744926872 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls") pod "image-registry-cd894dd8c-8msng" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02") : secret "image-registry-tls" not found Apr 17 20:45:13.019866 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:13.019697 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert podName:79374dd1-1272-4edf-9d10-449bca8feb97 nodeName:}" failed. No retries permitted until 2026-04-17 20:45:45.019687159 +0000 UTC m=+96.744954501 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert") pod "ingress-canary-6qvxz" (UID: "79374dd1-1272-4edf-9d10-449bca8feb97") : secret "canary-serving-cert" not found Apr 17 20:45:14.430312 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:45:14.430271 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") " pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:45:14.430692 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:14.430437 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:45:14.430692 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:14.430505 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs podName:83681f84-53f3-489d-9b30-0db22fc1b40e nodeName:}" failed. No retries permitted until 2026-04-17 20:46:18.430489449 +0000 UTC m=+130.155756783 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs") pod "network-metrics-daemon-gggxp" (UID: "83681f84-53f3-489d-9b30-0db22fc1b40e") : secret "metrics-daemon-secret" not found Apr 17 20:45:20.898396 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:45:20.898265 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4595v" Apr 17 20:45:45.038266 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:45:45.038219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng" Apr 17 20:45:45.038266 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:45:45.038273 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert\") pod \"ingress-canary-6qvxz\" (UID: \"79374dd1-1272-4edf-9d10-449bca8feb97\") " pod="openshift-ingress-canary/ingress-canary-6qvxz" Apr 17 20:45:45.038864 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:45.038365 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:45:45.038864 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:45.038385 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cd894dd8c-8msng: secret "image-registry-tls" not found Apr 17 20:45:45.038864 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:45.038388 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:45:45.038864 
ip-10-0-137-102 kubenswrapper[2572]: I0417 20:45:45.038441 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq" Apr 17 20:45:45.038864 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:45.038454 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert podName:79374dd1-1272-4edf-9d10-449bca8feb97 nodeName:}" failed. No retries permitted until 2026-04-17 20:46:49.038435226 +0000 UTC m=+160.763702582 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert") pod "ingress-canary-6qvxz" (UID: "79374dd1-1272-4edf-9d10-449bca8feb97") : secret "canary-serving-cert" not found Apr 17 20:45:45.038864 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:45.038486 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls podName:d441e83e-a15a-4717-ba02-8beb555aad02 nodeName:}" failed. No retries permitted until 2026-04-17 20:46:49.038468964 +0000 UTC m=+160.763736312 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls") pod "image-registry-cd894dd8c-8msng" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02") : secret "image-registry-tls" not found Apr 17 20:45:45.038864 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:45.038501 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:45:45.038864 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:45:45.038542 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls podName:c75b554f-1463-4bda-8049-f6e4988ffef7 nodeName:}" failed. No retries permitted until 2026-04-17 20:46:49.038530937 +0000 UTC m=+160.763798273 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls") pod "dns-default-zbsnq" (UID: "c75b554f-1463-4bda-8049-f6e4988ffef7") : secret "dns-default-metrics-tls" not found Apr 17 20:46:18.470843 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:18.470779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") " pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:46:18.471327 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:18.470925 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:46:18.471327 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:18.470993 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs 
podName:83681f84-53f3-489d-9b30-0db22fc1b40e nodeName:}" failed. No retries permitted until 2026-04-17 20:48:20.470977513 +0000 UTC m=+252.196244848 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs") pod "network-metrics-daemon-gggxp" (UID: "83681f84-53f3-489d-9b30-0db22fc1b40e") : secret "metrics-daemon-secret" not found
Apr 17 20:46:20.204970 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.204937 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-798db9665d-pb48m"]
Apr 17 20:46:20.207703 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.207688 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:20.209385 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.209359 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 17 20:46:20.209682 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.209662 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 17 20:46:20.209757 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.209679 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 20:46:20.209757 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.209721 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 17 20:46:20.209886 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.209791 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 20:46:20.210071 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.210060 2572
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hzs2w\""
Apr 17 20:46:20.210110 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.210059 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 17 20:46:20.218054 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.218032 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-798db9665d-pb48m"]
Apr 17 20:46:20.283340 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.283308 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-stats-auth\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:20.283516 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.283380 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-default-certificate\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:20.283516 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.283401 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:20.283516 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.283437 2572 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:20.283516 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.283458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kks5t\" (UniqueName: \"kubernetes.io/projected/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-kube-api-access-kks5t\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:20.384680 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.384648 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-default-certificate\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:20.384680 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.384683 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:20.384961 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.384705 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle\") pod \"router-default-798db9665d-pb48m\" (UID: 
\"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:20.384961 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:20.384785 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:46:20.384961 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.384841 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kks5t\" (UniqueName: \"kubernetes.io/projected/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-kube-api-access-kks5t\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:20.384961 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:20.384866 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs podName:b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b nodeName:}" failed. No retries permitted until 2026-04-17 20:46:20.884848981 +0000 UTC m=+132.610116316 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs") pod "router-default-798db9665d-pb48m" (UID: "b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b") : secret "router-metrics-certs-default" not found Apr 17 20:46:20.384961 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.384914 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-stats-auth\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:20.384961 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:20.384938 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle podName:b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b nodeName:}" failed. No retries permitted until 2026-04-17 20:46:20.884921011 +0000 UTC m=+132.610188357 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle") pod "router-default-798db9665d-pb48m" (UID: "b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b") : configmap references non-existent config key: service-ca.crt Apr 17 20:46:20.388059 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.388036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-stats-auth\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:20.388156 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.388076 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-default-certificate\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:20.392122 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.392098 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kks5t\" (UniqueName: \"kubernetes.io/projected/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-kube-api-access-kks5t\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:20.888574 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.888540 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:20.888574 
ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:20.888578 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:20.888782 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:20.888686 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 20:46:20.888782 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:20.888713 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle podName:b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b nodeName:}" failed. No retries permitted until 2026-04-17 20:46:21.888700282 +0000 UTC m=+133.613967618 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle") pod "router-default-798db9665d-pb48m" (UID: "b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b") : configmap references non-existent config key: service-ca.crt
Apr 17 20:46:20.888782 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:20.888746 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs podName:b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b nodeName:}" failed. No retries permitted until 2026-04-17 20:46:21.888732602 +0000 UTC m=+133.613999937 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs") pod "router-default-798db9665d-pb48m" (UID: "b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b") : secret "router-metrics-certs-default" not found Apr 17 20:46:21.895636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:21.895603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:21.895636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:21.895639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:21.896059 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:21.895744 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:46:21.896059 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:21.895762 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle podName:b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b nodeName:}" failed. No retries permitted until 2026-04-17 20:46:23.895749727 +0000 UTC m=+135.621017061 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle") pod "router-default-798db9665d-pb48m" (UID: "b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b") : configmap references non-existent config key: service-ca.crt Apr 17 20:46:21.896059 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:21.895790 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs podName:b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b nodeName:}" failed. No retries permitted until 2026-04-17 20:46:23.895778021 +0000 UTC m=+135.621045356 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs") pod "router-default-798db9665d-pb48m" (UID: "b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b") : secret "router-metrics-certs-default" not found Apr 17 20:46:23.910956 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:23.910927 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:23.910956 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:23.910962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:23.911496 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:23.911083 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: 
secret "router-metrics-certs-default" not found Apr 17 20:46:23.911496 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:23.911111 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle podName:b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b nodeName:}" failed. No retries permitted until 2026-04-17 20:46:27.911097925 +0000 UTC m=+139.636365260 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle") pod "router-default-798db9665d-pb48m" (UID: "b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b") : configmap references non-existent config key: service-ca.crt Apr 17 20:46:23.911496 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:23.911136 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs podName:b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b nodeName:}" failed. No retries permitted until 2026-04-17 20:46:27.911122581 +0000 UTC m=+139.636389917 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs") pod "router-default-798db9665d-pb48m" (UID: "b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b") : secret "router-metrics-certs-default" not found Apr 17 20:46:27.857907 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:27.857874 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kk8ft_4e6f44f7-3183-4cd5-9000-e9662459d6af/dns-node-resolver/0.log" Apr 17 20:46:27.938027 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:27.937988 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:27.938180 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:27.938094 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m" Apr 17 20:46:27.938180 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:27.938147 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle podName:b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b nodeName:}" failed. No retries permitted until 2026-04-17 20:46:35.938129014 +0000 UTC m=+147.663396349 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle") pod "router-default-798db9665d-pb48m" (UID: "b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b") : configmap references non-existent config key: service-ca.crt Apr 17 20:46:27.938180 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:27.938173 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:46:27.938297 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:27.938224 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs podName:b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b nodeName:}" failed. No retries permitted until 2026-04-17 20:46:35.938212887 +0000 UTC m=+147.663480222 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs") pod "router-default-798db9665d-pb48m" (UID: "b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b") : secret "router-metrics-certs-default" not found Apr 17 20:46:28.857640 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:28.857612 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vh6vg_73fda819-3b91-4892-92f0-995a9a9014c8/node-ca/0.log" Apr 17 20:46:29.288908 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.288876 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-j2dtr"] Apr 17 20:46:29.291767 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.291752 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-j2dtr"
Apr 17 20:46:29.293675 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.293656 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 17 20:46:29.293777 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.293657 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-jcq4b\""
Apr 17 20:46:29.294368 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.294348 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 17 20:46:29.294483 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.294368 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 17 20:46:29.294483 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.294445 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 17 20:46:29.299096 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.299079 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-j2dtr"]
Apr 17 20:46:29.450057 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.450029 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e0c1e8b3-e175-489e-bf82-cb1439db22bc-signing-key\") pod \"service-ca-865cb79987-j2dtr\" (UID: \"e0c1e8b3-e175-489e-bf82-cb1439db22bc\") " pod="openshift-service-ca/service-ca-865cb79987-j2dtr"
Apr 17 20:46:29.450291 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.450201 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName:
\"kubernetes.io/configmap/e0c1e8b3-e175-489e-bf82-cb1439db22bc-signing-cabundle\") pod \"service-ca-865cb79987-j2dtr\" (UID: \"e0c1e8b3-e175-489e-bf82-cb1439db22bc\") " pod="openshift-service-ca/service-ca-865cb79987-j2dtr" Apr 17 20:46:29.450291 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.450228 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8h55\" (UniqueName: \"kubernetes.io/projected/e0c1e8b3-e175-489e-bf82-cb1439db22bc-kube-api-access-c8h55\") pod \"service-ca-865cb79987-j2dtr\" (UID: \"e0c1e8b3-e175-489e-bf82-cb1439db22bc\") " pod="openshift-service-ca/service-ca-865cb79987-j2dtr" Apr 17 20:46:29.550982 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.550906 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e0c1e8b3-e175-489e-bf82-cb1439db22bc-signing-cabundle\") pod \"service-ca-865cb79987-j2dtr\" (UID: \"e0c1e8b3-e175-489e-bf82-cb1439db22bc\") " pod="openshift-service-ca/service-ca-865cb79987-j2dtr" Apr 17 20:46:29.550982 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.550940 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8h55\" (UniqueName: \"kubernetes.io/projected/e0c1e8b3-e175-489e-bf82-cb1439db22bc-kube-api-access-c8h55\") pod \"service-ca-865cb79987-j2dtr\" (UID: \"e0c1e8b3-e175-489e-bf82-cb1439db22bc\") " pod="openshift-service-ca/service-ca-865cb79987-j2dtr" Apr 17 20:46:29.550982 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.550975 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e0c1e8b3-e175-489e-bf82-cb1439db22bc-signing-key\") pod \"service-ca-865cb79987-j2dtr\" (UID: \"e0c1e8b3-e175-489e-bf82-cb1439db22bc\") " pod="openshift-service-ca/service-ca-865cb79987-j2dtr" Apr 17 20:46:29.551523 ip-10-0-137-102 kubenswrapper[2572]: I0417 
20:46:29.551492 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e0c1e8b3-e175-489e-bf82-cb1439db22bc-signing-cabundle\") pod \"service-ca-865cb79987-j2dtr\" (UID: \"e0c1e8b3-e175-489e-bf82-cb1439db22bc\") " pod="openshift-service-ca/service-ca-865cb79987-j2dtr"
Apr 17 20:46:29.553298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.553281 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e0c1e8b3-e175-489e-bf82-cb1439db22bc-signing-key\") pod \"service-ca-865cb79987-j2dtr\" (UID: \"e0c1e8b3-e175-489e-bf82-cb1439db22bc\") " pod="openshift-service-ca/service-ca-865cb79987-j2dtr"
Apr 17 20:46:29.557475 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.557456 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8h55\" (UniqueName: \"kubernetes.io/projected/e0c1e8b3-e175-489e-bf82-cb1439db22bc-kube-api-access-c8h55\") pod \"service-ca-865cb79987-j2dtr\" (UID: \"e0c1e8b3-e175-489e-bf82-cb1439db22bc\") " pod="openshift-service-ca/service-ca-865cb79987-j2dtr"
Apr 17 20:46:29.600955 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.600924 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-j2dtr"
Apr 17 20:46:29.709566 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:29.709501 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-j2dtr"]
Apr 17 20:46:29.712126 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:46:29.712092 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0c1e8b3_e175_489e_bf82_cb1439db22bc.slice/crio-719ab3a03bdc22b6622c6b493ee04352f72bc25dd05fa3559f0021b1309f3a2f WatchSource:0}: Error finding container 719ab3a03bdc22b6622c6b493ee04352f72bc25dd05fa3559f0021b1309f3a2f: Status 404 returned error can't find the container with id 719ab3a03bdc22b6622c6b493ee04352f72bc25dd05fa3559f0021b1309f3a2f
Apr 17 20:46:30.078694 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:30.078657 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-j2dtr" event={"ID":"e0c1e8b3-e175-489e-bf82-cb1439db22bc","Type":"ContainerStarted","Data":"719ab3a03bdc22b6622c6b493ee04352f72bc25dd05fa3559f0021b1309f3a2f"}
Apr 17 20:46:32.084688 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:32.084656 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-j2dtr" event={"ID":"e0c1e8b3-e175-489e-bf82-cb1439db22bc","Type":"ContainerStarted","Data":"d103c8183f9019dfe9dc888fab0423b5d92c5615e1928ad3a005b229901b9147"}
Apr 17 20:46:32.099226 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:32.099178 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-j2dtr" podStartSLOduration=1.26527964 podStartE2EDuration="3.099162918s" podCreationTimestamp="2026-04-17 20:46:29 +0000 UTC" firstStartedPulling="2026-04-17 20:46:29.713833236 +0000 UTC m=+141.439100574" lastFinishedPulling="2026-04-17 20:46:31.547716516 +0000 UTC
m=+143.272983852" observedRunningTime="2026-04-17 20:46:32.098385714 +0000 UTC m=+143.823653068" watchObservedRunningTime="2026-04-17 20:46:32.099162918 +0000 UTC m=+143.824430274"
Apr 17 20:46:35.999992 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:35.999960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:35.999992 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:35.999999 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:36.000389 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:36.000106 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 20:46:36.000389 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:36.000123 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle podName:b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b nodeName:}" failed. No retries permitted until 2026-04-17 20:46:52.000106024 +0000 UTC m=+163.725373387 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle") pod "router-default-798db9665d-pb48m" (UID: "b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b") : configmap references non-existent config key: service-ca.crt
Apr 17 20:46:36.000389 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:36.000149 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs podName:b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b nodeName:}" failed. No retries permitted until 2026-04-17 20:46:52.000138989 +0000 UTC m=+163.725406332 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs") pod "router-default-798db9665d-pb48m" (UID: "b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b") : secret "router-metrics-certs-default" not found
Apr 17 20:46:44.165892 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:44.165856 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-cd894dd8c-8msng" podUID="d441e83e-a15a-4717-ba02-8beb555aad02"
Apr 17 20:46:44.177285 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:44.177246 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-zbsnq" podUID="c75b554f-1463-4bda-8049-f6e4988ffef7"
Apr 17 20:46:44.183392 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:44.183365 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6qvxz" podUID="79374dd1-1272-4edf-9d10-449bca8feb97"
Apr 17 20:46:44.723327 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:44.723292 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-gggxp" podUID="83681f84-53f3-489d-9b30-0db22fc1b40e"
Apr 17 20:46:45.109012 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:45.108987 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cd894dd8c-8msng"
Apr 17 20:46:45.109168 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:45.109017 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6qvxz"
Apr 17 20:46:45.109168 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:45.109133 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zbsnq"
Apr 17 20:46:48.667151 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.667118 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-nxrn6"]
Apr 17 20:46:48.669947 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.669930 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.672520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.672487 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jrrs2\""
Apr 17 20:46:48.672652 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.672523 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 20:46:48.672652 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.672531 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 20:46:48.672652 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.672563 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 20:46:48.672652 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.672569 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 20:46:48.681550 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.681527 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nxrn6"]
Apr 17 20:46:48.692025 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.691999 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4957c690-d1ba-429f-802c-407963355d82-data-volume\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.692106 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.692073 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vhks\" (UniqueName: \"kubernetes.io/projected/4957c690-d1ba-429f-802c-407963355d82-kube-api-access-2vhks\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.692106 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.692095 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4957c690-d1ba-429f-802c-407963355d82-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.692178 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.692165 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4957c690-d1ba-429f-802c-407963355d82-crio-socket\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.692214 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.692190 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4957c690-d1ba-429f-802c-407963355d82-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.724478 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.724457 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-cd894dd8c-8msng"]
Apr 17 20:46:48.724616 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:48.724592 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-cd894dd8c-8msng" podUID="d441e83e-a15a-4717-ba02-8beb555aad02"
Apr 17 20:46:48.793033 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.793010 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vhks\" (UniqueName: \"kubernetes.io/projected/4957c690-d1ba-429f-802c-407963355d82-kube-api-access-2vhks\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.793140 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.793040 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4957c690-d1ba-429f-802c-407963355d82-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.793140 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.793078 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4957c690-d1ba-429f-802c-407963355d82-crio-socket\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.793140 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.793095 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4957c690-d1ba-429f-802c-407963355d82-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.793140 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.793132 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4957c690-d1ba-429f-802c-407963355d82-data-volume\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.793368 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.793154 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4957c690-d1ba-429f-802c-407963355d82-crio-socket\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.793465 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.793450 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4957c690-d1ba-429f-802c-407963355d82-data-volume\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.793634 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.793619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4957c690-d1ba-429f-802c-407963355d82-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.795278 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.795258 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4957c690-d1ba-429f-802c-407963355d82-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.833552 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.833532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vhks\" (UniqueName: \"kubernetes.io/projected/4957c690-d1ba-429f-802c-407963355d82-kube-api-access-2vhks\") pod \"insights-runtime-extractor-nxrn6\" (UID: \"4957c690-d1ba-429f-802c-407963355d82\") " pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:48.978363 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:48.978289 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nxrn6"
Apr 17 20:46:49.089497 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.089469 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nxrn6"]
Apr 17 20:46:49.092735 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:46:49.092709 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4957c690_d1ba_429f_802c_407963355d82.slice/crio-f617b74213512ee4ca596e93f84f3d583a4d5c293a8941fccd609e10b7bda126 WatchSource:0}: Error finding container f617b74213512ee4ca596e93f84f3d583a4d5c293a8941fccd609e10b7bda126: Status 404 returned error can't find the container with id f617b74213512ee4ca596e93f84f3d583a4d5c293a8941fccd609e10b7bda126
Apr 17 20:46:49.095317 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.095288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert\") pod \"ingress-canary-6qvxz\" (UID: \"79374dd1-1272-4edf-9d10-449bca8feb97\") " pod="openshift-ingress-canary/ingress-canary-6qvxz"
Apr 17 20:46:49.095408 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.095335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq"
Apr 17 20:46:49.095408 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.095373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng"
Apr 17 20:46:49.097752 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.097733 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls\") pod \"image-registry-cd894dd8c-8msng\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") " pod="openshift-image-registry/image-registry-cd894dd8c-8msng"
Apr 17 20:46:49.098028 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.098010 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c75b554f-1463-4bda-8049-f6e4988ffef7-metrics-tls\") pod \"dns-default-zbsnq\" (UID: \"c75b554f-1463-4bda-8049-f6e4988ffef7\") " pod="openshift-dns/dns-default-zbsnq"
Apr 17 20:46:49.098098 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.098082 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79374dd1-1272-4edf-9d10-449bca8feb97-cert\") pod \"ingress-canary-6qvxz\" (UID: \"79374dd1-1272-4edf-9d10-449bca8feb97\") " pod="openshift-ingress-canary/ingress-canary-6qvxz"
Apr 17 20:46:49.118087 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.118058 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nxrn6" event={"ID":"4957c690-d1ba-429f-802c-407963355d82","Type":"ContainerStarted","Data":"f617b74213512ee4ca596e93f84f3d583a4d5c293a8941fccd609e10b7bda126"}
Apr 17 20:46:49.118170 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.118099 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cd894dd8c-8msng"
Apr 17 20:46:49.129580 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.129562 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cd894dd8c-8msng"
Apr 17 20:46:49.196536 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.196509 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d441e83e-a15a-4717-ba02-8beb555aad02-ca-trust-extracted\") pod \"d441e83e-a15a-4717-ba02-8beb555aad02\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") "
Apr 17 20:46:49.196677 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.196552 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rblzb\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-kube-api-access-rblzb\") pod \"d441e83e-a15a-4717-ba02-8beb555aad02\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") "
Apr 17 20:46:49.196677 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.196586 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d441e83e-a15a-4717-ba02-8beb555aad02-image-registry-private-configuration\") pod \"d441e83e-a15a-4717-ba02-8beb555aad02\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") "
Apr 17 20:46:49.196677 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.196619 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-bound-sa-token\") pod \"d441e83e-a15a-4717-ba02-8beb555aad02\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") "
Apr 17 20:46:49.196677 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.196666 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d441e83e-a15a-4717-ba02-8beb555aad02-installation-pull-secrets\") pod \"d441e83e-a15a-4717-ba02-8beb555aad02\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") "
Apr 17 20:46:49.196878 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.196834 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls\") pod \"d441e83e-a15a-4717-ba02-8beb555aad02\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") "
Apr 17 20:46:49.196938 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.196884 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d441e83e-a15a-4717-ba02-8beb555aad02-registry-certificates\") pod \"d441e83e-a15a-4717-ba02-8beb555aad02\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") "
Apr 17 20:46:49.196938 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.196917 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d441e83e-a15a-4717-ba02-8beb555aad02-trusted-ca\") pod \"d441e83e-a15a-4717-ba02-8beb555aad02\" (UID: \"d441e83e-a15a-4717-ba02-8beb555aad02\") "
Apr 17 20:46:49.197419 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.196837 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d441e83e-a15a-4717-ba02-8beb555aad02-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d441e83e-a15a-4717-ba02-8beb555aad02" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:46:49.197539 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.197340 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d441e83e-a15a-4717-ba02-8beb555aad02-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d441e83e-a15a-4717-ba02-8beb555aad02" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:46:49.197539 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.197445 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d441e83e-a15a-4717-ba02-8beb555aad02-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d441e83e-a15a-4717-ba02-8beb555aad02" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:46:49.198935 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.198898 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-kube-api-access-rblzb" (OuterVolumeSpecName: "kube-api-access-rblzb") pod "d441e83e-a15a-4717-ba02-8beb555aad02" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02"). InnerVolumeSpecName "kube-api-access-rblzb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:46:49.198935 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.198918 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d441e83e-a15a-4717-ba02-8beb555aad02-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d441e83e-a15a-4717-ba02-8beb555aad02" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:46:49.199075 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.198973 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d441e83e-a15a-4717-ba02-8beb555aad02-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d441e83e-a15a-4717-ba02-8beb555aad02" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:46:49.199134 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.199093 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d441e83e-a15a-4717-ba02-8beb555aad02" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:46:49.199134 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.199099 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d441e83e-a15a-4717-ba02-8beb555aad02" (UID: "d441e83e-a15a-4717-ba02-8beb555aad02"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:46:49.298411 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.298383 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d441e83e-a15a-4717-ba02-8beb555aad02-installation-pull-secrets\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:46:49.298411 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.298406 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-registry-tls\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:46:49.298411 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.298416 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d441e83e-a15a-4717-ba02-8beb555aad02-registry-certificates\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:46:49.298636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.298425 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d441e83e-a15a-4717-ba02-8beb555aad02-trusted-ca\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:46:49.298636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.298435 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d441e83e-a15a-4717-ba02-8beb555aad02-ca-trust-extracted\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:46:49.298636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.298444 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rblzb\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-kube-api-access-rblzb\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:46:49.298636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.298453 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d441e83e-a15a-4717-ba02-8beb555aad02-image-registry-private-configuration\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:46:49.298636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.298463 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d441e83e-a15a-4717-ba02-8beb555aad02-bound-sa-token\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:46:49.312144 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.312123 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8pvd4\""
Apr 17 20:46:49.312210 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.312145 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pn5wd\""
Apr 17 20:46:49.320645 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.320623 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6qvxz"
Apr 17 20:46:49.320724 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.320648 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zbsnq"
Apr 17 20:46:49.443458 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.443418 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6qvxz"]
Apr 17 20:46:49.447780 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:46:49.447753 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79374dd1_1272_4edf_9d10_449bca8feb97.slice/crio-a684f69f9b73a97f310ad66030394129360e84b705bf813151755308747ddfaa WatchSource:0}: Error finding container a684f69f9b73a97f310ad66030394129360e84b705bf813151755308747ddfaa: Status 404 returned error can't find the container with id a684f69f9b73a97f310ad66030394129360e84b705bf813151755308747ddfaa
Apr 17 20:46:49.462017 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:49.461994 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zbsnq"]
Apr 17 20:46:49.464583 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:46:49.464558 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc75b554f_1463_4bda_8049_f6e4988ffef7.slice/crio-b6d09a1405643385b19f76545deea23e5700ad0f29edf0597d29ddd93a5be159 WatchSource:0}: Error finding container b6d09a1405643385b19f76545deea23e5700ad0f29edf0597d29ddd93a5be159: Status 404 returned error can't find the container with id b6d09a1405643385b19f76545deea23e5700ad0f29edf0597d29ddd93a5be159
Apr 17 20:46:50.121835 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:50.121742 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zbsnq" event={"ID":"c75b554f-1463-4bda-8049-f6e4988ffef7","Type":"ContainerStarted","Data":"b6d09a1405643385b19f76545deea23e5700ad0f29edf0597d29ddd93a5be159"}
Apr 17 20:46:50.123367 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:50.123339 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nxrn6" event={"ID":"4957c690-d1ba-429f-802c-407963355d82","Type":"ContainerStarted","Data":"5a73c4395c7726e60afe4d04ddf4b4c3c20ceaf6dbfe666bf1cb3522353c9d97"}
Apr 17 20:46:50.123486 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:50.123371 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nxrn6" event={"ID":"4957c690-d1ba-429f-802c-407963355d82","Type":"ContainerStarted","Data":"60953b90fd4137db6f6e8924785fe6a604e4c40eff9ea6108d75b287ba16be54"}
Apr 17 20:46:50.124307 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:50.124286 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6qvxz" event={"ID":"79374dd1-1272-4edf-9d10-449bca8feb97","Type":"ContainerStarted","Data":"a684f69f9b73a97f310ad66030394129360e84b705bf813151755308747ddfaa"}
Apr 17 20:46:50.124415 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:50.124323 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cd894dd8c-8msng"
Apr 17 20:46:50.151036 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:50.151007 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-cd894dd8c-8msng"]
Apr 17 20:46:50.154189 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:50.154170 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-cd894dd8c-8msng"]
Apr 17 20:46:50.715075 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:50.715034 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d441e83e-a15a-4717-ba02-8beb555aad02" path="/var/lib/kubelet/pods/d441e83e-a15a-4717-ba02-8beb555aad02/volumes"
Apr 17 20:46:52.019264 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:52.019243 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:52.019620 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:52.019275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:52.019828 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:52.019785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-service-ca-bundle\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:52.021662 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:52.021629 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b-metrics-certs\") pod \"router-default-798db9665d-pb48m\" (UID: \"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b\") " pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:52.131529 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:52.131498 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zbsnq" event={"ID":"c75b554f-1463-4bda-8049-f6e4988ffef7","Type":"ContainerStarted","Data":"3c76e565adcbebc416c3f12d69a6c1e719e78bf1d0ee3ea4f35dca18cd53e4ce"}
Apr 17 20:46:52.133575 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:52.133549 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nxrn6" event={"ID":"4957c690-d1ba-429f-802c-407963355d82","Type":"ContainerStarted","Data":"d534826f9ec48583399d5e2e5906a2ae555fc64495f3514f09ff2eb42b745fb2"}
Apr 17 20:46:52.135299 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:52.135274 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6qvxz" event={"ID":"79374dd1-1272-4edf-9d10-449bca8feb97","Type":"ContainerStarted","Data":"76d312fd294c611a54cd5ff311feb950eb61bf02e31de97ac91143bcd9d361af"}
Apr 17 20:46:52.148190 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:52.148144 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-nxrn6" podStartSLOduration=1.284598541 podStartE2EDuration="4.148129723s" podCreationTimestamp="2026-04-17 20:46:48 +0000 UTC" firstStartedPulling="2026-04-17 20:46:49.149841735 +0000 UTC m=+160.875109070" lastFinishedPulling="2026-04-17 20:46:52.013372912 +0000 UTC m=+163.738640252" observedRunningTime="2026-04-17 20:46:52.147337991 +0000 UTC m=+163.872605367" watchObservedRunningTime="2026-04-17 20:46:52.148129723 +0000 UTC m=+163.873397080"
Apr 17 20:46:52.316295 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:52.316264 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:52.427362 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:52.427303 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6qvxz" podStartSLOduration=128.86416244 podStartE2EDuration="2m11.427284988s" podCreationTimestamp="2026-04-17 20:44:41 +0000 UTC" firstStartedPulling="2026-04-17 20:46:49.449737392 +0000 UTC m=+161.175004730" lastFinishedPulling="2026-04-17 20:46:52.012859944 +0000 UTC m=+163.738127278" observedRunningTime="2026-04-17 20:46:52.162423261 +0000 UTC m=+163.887690621" watchObservedRunningTime="2026-04-17 20:46:52.427284988 +0000 UTC m=+164.152552339"
Apr 17 20:46:52.427650 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:52.427635 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-798db9665d-pb48m"]
Apr 17 20:46:52.430464 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:46:52.430440 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb366eb11_7fda_4ca8_ae8e_3fb7b0b24a8b.slice/crio-731baba904a14f959f14cca1e65712832b4b18e7835dc3df407bb5b45240f996 WatchSource:0}: Error finding container 731baba904a14f959f14cca1e65712832b4b18e7835dc3df407bb5b45240f996: Status 404 returned error can't find the container with id 731baba904a14f959f14cca1e65712832b4b18e7835dc3df407bb5b45240f996
Apr 17 20:46:53.140969 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:53.140936 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zbsnq" event={"ID":"c75b554f-1463-4bda-8049-f6e4988ffef7","Type":"ContainerStarted","Data":"3131a6440495c3f37889939087eafea214a51f8918d1b9957525afde970fefbf"}
Apr 17 20:46:53.141410 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:53.141023 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zbsnq"
Apr 17 20:46:53.142107 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:53.142087 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-798db9665d-pb48m" event={"ID":"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b","Type":"ContainerStarted","Data":"6b6c553dfa85becadc8bf618819404963f8481a201ec7d85353060356758f366"}
Apr 17 20:46:53.142107 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:53.142109 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-798db9665d-pb48m" event={"ID":"b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b","Type":"ContainerStarted","Data":"731baba904a14f959f14cca1e65712832b4b18e7835dc3df407bb5b45240f996"}
Apr 17 20:46:53.155612 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:53.155574 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zbsnq" podStartSLOduration=129.611268302 podStartE2EDuration="2m12.155561859s" podCreationTimestamp="2026-04-17 20:44:41 +0000 UTC" firstStartedPulling="2026-04-17 20:46:49.466313103 +0000 UTC m=+161.191580441" lastFinishedPulling="2026-04-17 20:46:52.010606663 +0000 UTC m=+163.735873998" observedRunningTime="2026-04-17 20:46:53.154467264 +0000 UTC m=+164.879734632" watchObservedRunningTime="2026-04-17 20:46:53.155561859 +0000 UTC m=+164.880829215"
Apr 17 20:46:53.317149 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:53.317115 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:53.319631 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:53.319608 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:53.337727 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:53.337686 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-798db9665d-pb48m" podStartSLOduration=33.337674148 podStartE2EDuration="33.337674148s" podCreationTimestamp="2026-04-17 20:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:46:53.169609653 +0000 UTC m=+164.894877032" watchObservedRunningTime="2026-04-17 20:46:53.337674148 +0000 UTC m=+165.062941504"
Apr 17 20:46:54.144707 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:54.144674 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:54.146014 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:54.145995 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-798db9665d-pb48m"
Apr 17 20:46:54.858864 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:54.858835 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pfxbt"]
Apr 17 20:46:54.861998 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:54.861977 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pfxbt"
Apr 17 20:46:54.863793 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:54.863773 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 20:46:54.863887 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:54.863773 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-6mrjq\""
Apr 17 20:46:54.867735 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:54.867714 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pfxbt"]
Apr 17 20:46:54.940795 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:54.940768 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/50669672-c6e8-45ba-a563-ff0834c4d0bf-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pfxbt\" (UID: \"50669672-c6e8-45ba-a563-ff0834c4d0bf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pfxbt"
Apr 17 20:46:55.040995 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:55.040969 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/50669672-c6e8-45ba-a563-ff0834c4d0bf-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pfxbt\" (UID: \"50669672-c6e8-45ba-a563-ff0834c4d0bf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pfxbt"
Apr 17 20:46:55.041130 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:55.041113 2572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret
"prometheus-operator-admission-webhook-tls" not found Apr 17 20:46:55.041184 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:46:55.041173 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50669672-c6e8-45ba-a563-ff0834c4d0bf-tls-certificates podName:50669672-c6e8-45ba-a563-ff0834c4d0bf nodeName:}" failed. No retries permitted until 2026-04-17 20:46:55.541158445 +0000 UTC m=+167.266425780 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/50669672-c6e8-45ba-a563-ff0834c4d0bf-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-pfxbt" (UID: "50669672-c6e8-45ba-a563-ff0834c4d0bf") : secret "prometheus-operator-admission-webhook-tls" not found Apr 17 20:46:55.544520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:55.544480 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/50669672-c6e8-45ba-a563-ff0834c4d0bf-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pfxbt\" (UID: \"50669672-c6e8-45ba-a563-ff0834c4d0bf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pfxbt" Apr 17 20:46:55.546975 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:55.546954 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/50669672-c6e8-45ba-a563-ff0834c4d0bf-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pfxbt\" (UID: \"50669672-c6e8-45ba-a563-ff0834c4d0bf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pfxbt" Apr 17 20:46:55.772560 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:55.772522 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pfxbt" Apr 17 20:46:55.883441 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:55.883409 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pfxbt"] Apr 17 20:46:55.886291 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:46:55.886258 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50669672_c6e8_45ba_a563_ff0834c4d0bf.slice/crio-a870f2077bb8cd9f42ecb76a71a1cc05baf497808fccc50d055734b00e986e04 WatchSource:0}: Error finding container a870f2077bb8cd9f42ecb76a71a1cc05baf497808fccc50d055734b00e986e04: Status 404 returned error can't find the container with id a870f2077bb8cd9f42ecb76a71a1cc05baf497808fccc50d055734b00e986e04 Apr 17 20:46:56.150525 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:56.150448 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pfxbt" event={"ID":"50669672-c6e8-45ba-a563-ff0834c4d0bf","Type":"ContainerStarted","Data":"a870f2077bb8cd9f42ecb76a71a1cc05baf497808fccc50d055734b00e986e04"} Apr 17 20:46:56.708555 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:56.708520 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:46:57.154292 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.154263 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pfxbt" event={"ID":"50669672-c6e8-45ba-a563-ff0834c4d0bf","Type":"ContainerStarted","Data":"dfb182e3f103711d78d26a71ebeb90b6418dd6dc2709b697da12d558ca2543ee"} Apr 17 20:46:57.154468 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.154452 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pfxbt" Apr 17 20:46:57.158838 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.158820 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pfxbt" Apr 17 20:46:57.167328 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.167291 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pfxbt" podStartSLOduration=1.98032236 podStartE2EDuration="3.167278584s" podCreationTimestamp="2026-04-17 20:46:54 +0000 UTC" firstStartedPulling="2026-04-17 20:46:55.888070421 +0000 UTC m=+167.613337761" lastFinishedPulling="2026-04-17 20:46:57.07502664 +0000 UTC m=+168.800293985" observedRunningTime="2026-04-17 20:46:57.16634235 +0000 UTC m=+168.891609706" watchObservedRunningTime="2026-04-17 20:46:57.167278584 +0000 UTC m=+168.892545938" Apr 17 20:46:57.919874 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.919843 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-mlb85"] Apr 17 20:46:57.922768 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.922752 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" Apr 17 20:46:57.925139 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.925104 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 20:46:57.925273 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.925149 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 20:46:57.925273 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.925168 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 20:46:57.925273 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.925213 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 20:46:57.925273 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.925217 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-h6fkf\"" Apr 17 20:46:57.925273 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.925149 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 20:46:57.931691 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.931674 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-mlb85"] Apr 17 20:46:57.963037 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.963012 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39b7343d-bfa3-457e-8d3b-64225e9b2a48-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-mlb85\" (UID: \"39b7343d-bfa3-457e-8d3b-64225e9b2a48\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" Apr 17 20:46:57.963138 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.963051 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-574j6\" (UniqueName: \"kubernetes.io/projected/39b7343d-bfa3-457e-8d3b-64225e9b2a48-kube-api-access-574j6\") pod \"prometheus-operator-5676c8c784-mlb85\" (UID: \"39b7343d-bfa3-457e-8d3b-64225e9b2a48\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" Apr 17 20:46:57.963138 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.963081 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39b7343d-bfa3-457e-8d3b-64225e9b2a48-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-mlb85\" (UID: \"39b7343d-bfa3-457e-8d3b-64225e9b2a48\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" Apr 17 20:46:57.963215 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:57.963125 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b7343d-bfa3-457e-8d3b-64225e9b2a48-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-mlb85\" (UID: \"39b7343d-bfa3-457e-8d3b-64225e9b2a48\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" Apr 17 20:46:58.063991 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:58.063958 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39b7343d-bfa3-457e-8d3b-64225e9b2a48-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-mlb85\" (UID: \"39b7343d-bfa3-457e-8d3b-64225e9b2a48\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" Apr 17 20:46:58.064127 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:46:58.064005 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-574j6\" (UniqueName: \"kubernetes.io/projected/39b7343d-bfa3-457e-8d3b-64225e9b2a48-kube-api-access-574j6\") pod \"prometheus-operator-5676c8c784-mlb85\" (UID: \"39b7343d-bfa3-457e-8d3b-64225e9b2a48\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" Apr 17 20:46:58.064127 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:58.064035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39b7343d-bfa3-457e-8d3b-64225e9b2a48-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-mlb85\" (UID: \"39b7343d-bfa3-457e-8d3b-64225e9b2a48\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" Apr 17 20:46:58.064127 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:58.064053 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b7343d-bfa3-457e-8d3b-64225e9b2a48-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-mlb85\" (UID: \"39b7343d-bfa3-457e-8d3b-64225e9b2a48\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" Apr 17 20:46:58.064767 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:58.064748 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39b7343d-bfa3-457e-8d3b-64225e9b2a48-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-mlb85\" (UID: \"39b7343d-bfa3-457e-8d3b-64225e9b2a48\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" Apr 17 20:46:58.066336 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:58.066318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/39b7343d-bfa3-457e-8d3b-64225e9b2a48-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-mlb85\" (UID: \"39b7343d-bfa3-457e-8d3b-64225e9b2a48\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" Apr 17 20:46:58.066489 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:58.066472 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b7343d-bfa3-457e-8d3b-64225e9b2a48-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-mlb85\" (UID: \"39b7343d-bfa3-457e-8d3b-64225e9b2a48\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" Apr 17 20:46:58.070609 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:58.070590 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-574j6\" (UniqueName: \"kubernetes.io/projected/39b7343d-bfa3-457e-8d3b-64225e9b2a48-kube-api-access-574j6\") pod \"prometheus-operator-5676c8c784-mlb85\" (UID: \"39b7343d-bfa3-457e-8d3b-64225e9b2a48\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" Apr 17 20:46:58.231905 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:58.231826 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" Apr 17 20:46:58.362539 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:58.362506 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-mlb85"] Apr 17 20:46:58.366469 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:46:58.366439 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b7343d_bfa3_457e_8d3b_64225e9b2a48.slice/crio-ae148b6022909d1b6a263c5bb6509006fdba770ad0c177a43d3b678243f4f229 WatchSource:0}: Error finding container ae148b6022909d1b6a263c5bb6509006fdba770ad0c177a43d3b678243f4f229: Status 404 returned error can't find the container with id ae148b6022909d1b6a263c5bb6509006fdba770ad0c177a43d3b678243f4f229 Apr 17 20:46:59.166117 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:46:59.166080 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" event={"ID":"39b7343d-bfa3-457e-8d3b-64225e9b2a48","Type":"ContainerStarted","Data":"ae148b6022909d1b6a263c5bb6509006fdba770ad0c177a43d3b678243f4f229"} Apr 17 20:47:00.170063 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:00.170028 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" event={"ID":"39b7343d-bfa3-457e-8d3b-64225e9b2a48","Type":"ContainerStarted","Data":"569a9c2758e9dfb620f5158be7438454412f20a22e1afc6c1ef43854b2981830"} Apr 17 20:47:00.170063 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:00.170067 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" event={"ID":"39b7343d-bfa3-457e-8d3b-64225e9b2a48","Type":"ContainerStarted","Data":"d57785a310b73d77ec2c7fa6c224f4f0659670d62128df8e3d4dd2202ca57891"} Apr 17 20:47:00.185363 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:00.185312 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-mlb85" podStartSLOduration=1.981313864 podStartE2EDuration="3.185285104s" podCreationTimestamp="2026-04-17 20:46:57 +0000 UTC" firstStartedPulling="2026-04-17 20:46:58.368517905 +0000 UTC m=+170.093785243" lastFinishedPulling="2026-04-17 20:46:59.572489147 +0000 UTC m=+171.297756483" observedRunningTime="2026-04-17 20:47:00.184762499 +0000 UTC m=+171.910029857" watchObservedRunningTime="2026-04-17 20:47:00.185285104 +0000 UTC m=+171.910552461" Apr 17 20:47:02.269676 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.269641 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h"] Apr 17 20:47:02.273383 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.273359 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" Apr 17 20:47:02.275905 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.275679 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 20:47:02.275905 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.275705 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 20:47:02.275905 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.275721 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-rh58l\"" Apr 17 20:47:02.285813 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.284834 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hbxqp"] Apr 17 20:47:02.292982 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.292961 2572 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h"] Apr 17 20:47:02.293239 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.293217 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.295922 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.295357 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 20:47:02.296353 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.296335 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s8fns\"" Apr 17 20:47:02.296517 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.296496 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 20:47:02.296673 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.296435 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 20:47:02.298958 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.298937 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cdmls"] Apr 17 20:47:02.302491 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.302294 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.304421 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.304400 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 20:47:02.304629 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.304609 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 20:47:02.304884 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.304864 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 20:47:02.305079 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.304957 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-fn578\"" Apr 17 20:47:02.320737 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.320712 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cdmls"] Apr 17 20:47:02.397176 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397144 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eac69935-3fd7-4a29-9949-4a6e86df9ae0-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.397360 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397195 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-textfile\") pod \"node-exporter-hbxqp\" 
(UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.397360 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397230 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4390455-ccf6-4110-ab32-8ce17b8d9693-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-cdg4h\" (UID: \"d4390455-ccf6-4110-ab32-8ce17b8d9693\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" Apr 17 20:47:02.397360 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397256 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-tls\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.397360 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397310 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/eac69935-3fd7-4a29-9949-4a6e86df9ae0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.397360 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397344 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " 
pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.397612 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397400 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4390455-ccf6-4110-ab32-8ce17b8d9693-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-cdg4h\" (UID: \"d4390455-ccf6-4110-ab32-8ce17b8d9693\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" Apr 17 20:47:02.397612 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397435 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/eac69935-3fd7-4a29-9949-4a6e86df9ae0-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.397612 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397463 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eac69935-3fd7-4a29-9949-4a6e86df9ae0-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.397612 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397490 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd5tp\" (UniqueName: \"kubernetes.io/projected/d4390455-ccf6-4110-ab32-8ce17b8d9693-kube-api-access-bd5tp\") pod \"openshift-state-metrics-9d44df66c-cdg4h\" (UID: \"d4390455-ccf6-4110-ab32-8ce17b8d9693\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" Apr 17 20:47:02.397612 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:47:02.397528 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-wtmp\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.397612 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397546 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6bbfd575-a18e-4b5a-b924-f78e8e962f05-metrics-client-ca\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.397612 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397574 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4390455-ccf6-4110-ab32-8ce17b8d9693-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-cdg4h\" (UID: \"d4390455-ccf6-4110-ab32-8ce17b8d9693\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" Apr 17 20:47:02.397612 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397599 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/eac69935-3fd7-4a29-9949-4a6e86df9ae0-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.397612 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397614 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/6bbfd575-a18e-4b5a-b924-f78e8e962f05-sys\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.398049 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397634 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6bbfd575-a18e-4b5a-b924-f78e8e962f05-root\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.398049 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397660 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-accelerators-collector-config\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.398049 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397675 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrtgj\" (UniqueName: \"kubernetes.io/projected/6bbfd575-a18e-4b5a-b924-f78e8e962f05-kube-api-access-nrtgj\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.398049 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.397692 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grpwk\" (UniqueName: \"kubernetes.io/projected/eac69935-3fd7-4a29-9949-4a6e86df9ae0-kube-api-access-grpwk\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.498319 
ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.498280 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-wtmp\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.498319 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.498318 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6bbfd575-a18e-4b5a-b924-f78e8e962f05-metrics-client-ca\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.498563 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.498343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4390455-ccf6-4110-ab32-8ce17b8d9693-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-cdg4h\" (UID: \"d4390455-ccf6-4110-ab32-8ce17b8d9693\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" Apr 17 20:47:02.498563 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.498382 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/eac69935-3fd7-4a29-9949-4a6e86df9ae0-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.498563 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.498408 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6bbfd575-a18e-4b5a-b924-f78e8e962f05-sys\") pod \"node-exporter-hbxqp\" (UID: 
\"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.498563 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.498430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6bbfd575-a18e-4b5a-b924-f78e8e962f05-root\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.498563 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.498453 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-accelerators-collector-config\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.498563 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.498478 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-wtmp\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.498563 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.498502 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6bbfd575-a18e-4b5a-b924-f78e8e962f05-root\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.498563 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.498509 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6bbfd575-a18e-4b5a-b924-f78e8e962f05-sys\") pod \"node-exporter-hbxqp\" (UID: 
\"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.499000 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.498604 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrtgj\" (UniqueName: \"kubernetes.io/projected/6bbfd575-a18e-4b5a-b924-f78e8e962f05-kube-api-access-nrtgj\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.499000 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.498647 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grpwk\" (UniqueName: \"kubernetes.io/projected/eac69935-3fd7-4a29-9949-4a6e86df9ae0-kube-api-access-grpwk\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.499000 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.498678 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eac69935-3fd7-4a29-9949-4a6e86df9ae0-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.499143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.499002 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6bbfd575-a18e-4b5a-b924-f78e8e962f05-metrics-client-ca\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.499143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.499044 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" 
(UniqueName: \"kubernetes.io/configmap/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-accelerators-collector-config\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.499143 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.499099 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-textfile\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.499292 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.499155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4390455-ccf6-4110-ab32-8ce17b8d9693-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-cdg4h\" (UID: \"d4390455-ccf6-4110-ab32-8ce17b8d9693\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" Apr 17 20:47:02.499292 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.499185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-tls\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.499396 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:47:02.499376 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 20:47:02.499455 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:47:02.499443 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-tls 
podName:6bbfd575-a18e-4b5a-b924-f78e8e962f05 nodeName:}" failed. No retries permitted until 2026-04-17 20:47:02.999423645 +0000 UTC m=+174.724690984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-tls") pod "node-exporter-hbxqp" (UID: "6bbfd575-a18e-4b5a-b924-f78e8e962f05") : secret "node-exporter-tls" not found Apr 17 20:47:02.499522 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.499482 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eac69935-3fd7-4a29-9949-4a6e86df9ae0-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.499522 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.499500 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/eac69935-3fd7-4a29-9949-4a6e86df9ae0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.499628 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.499529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.499628 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.499563 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4390455-ccf6-4110-ab32-8ce17b8d9693-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-cdg4h\" (UID: \"d4390455-ccf6-4110-ab32-8ce17b8d9693\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" Apr 17 20:47:02.499628 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.499598 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/eac69935-3fd7-4a29-9949-4a6e86df9ae0-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.499907 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.499642 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eac69935-3fd7-4a29-9949-4a6e86df9ae0-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.499907 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.499673 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bd5tp\" (UniqueName: \"kubernetes.io/projected/d4390455-ccf6-4110-ab32-8ce17b8d9693-kube-api-access-bd5tp\") pod \"openshift-state-metrics-9d44df66c-cdg4h\" (UID: \"d4390455-ccf6-4110-ab32-8ce17b8d9693\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" Apr 17 20:47:02.500189 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.500169 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-textfile\") pod \"node-exporter-hbxqp\" (UID: 
\"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.500306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.500281 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/eac69935-3fd7-4a29-9949-4a6e86df9ae0-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.500490 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.500303 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/eac69935-3fd7-4a29-9949-4a6e86df9ae0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.500611 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.500303 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4390455-ccf6-4110-ab32-8ce17b8d9693-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-cdg4h\" (UID: \"d4390455-ccf6-4110-ab32-8ce17b8d9693\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" Apr 17 20:47:02.501435 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.501408 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4390455-ccf6-4110-ab32-8ce17b8d9693-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-cdg4h\" (UID: \"d4390455-ccf6-4110-ab32-8ce17b8d9693\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" Apr 17 20:47:02.501545 ip-10-0-137-102 kubenswrapper[2572]: I0417 
20:47:02.501525 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/eac69935-3fd7-4a29-9949-4a6e86df9ae0-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.502067 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.502031 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.502373 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.502352 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4390455-ccf6-4110-ab32-8ce17b8d9693-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-cdg4h\" (UID: \"d4390455-ccf6-4110-ab32-8ce17b8d9693\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" Apr 17 20:47:02.502751 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.502727 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eac69935-3fd7-4a29-9949-4a6e86df9ae0-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.506536 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.506513 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrtgj\" (UniqueName: 
\"kubernetes.io/projected/6bbfd575-a18e-4b5a-b924-f78e8e962f05-kube-api-access-nrtgj\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:02.506857 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.506798 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grpwk\" (UniqueName: \"kubernetes.io/projected/eac69935-3fd7-4a29-9949-4a6e86df9ae0-kube-api-access-grpwk\") pod \"kube-state-metrics-69db897b98-cdmls\" (UID: \"eac69935-3fd7-4a29-9949-4a6e86df9ae0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.507733 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.507718 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd5tp\" (UniqueName: \"kubernetes.io/projected/d4390455-ccf6-4110-ab32-8ce17b8d9693-kube-api-access-bd5tp\") pod \"openshift-state-metrics-9d44df66c-cdg4h\" (UID: \"d4390455-ccf6-4110-ab32-8ce17b8d9693\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" Apr 17 20:47:02.584449 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.584416 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" Apr 17 20:47:02.613519 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.613485 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" Apr 17 20:47:02.718339 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.718311 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h"] Apr 17 20:47:02.720849 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:47:02.720817 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4390455_ccf6_4110_ab32_8ce17b8d9693.slice/crio-9275ef42faf70ce1352760953c1e4a2773816dff3d222a061c66859f488c7ad8 WatchSource:0}: Error finding container 9275ef42faf70ce1352760953c1e4a2773816dff3d222a061c66859f488c7ad8: Status 404 returned error can't find the container with id 9275ef42faf70ce1352760953c1e4a2773816dff3d222a061c66859f488c7ad8 Apr 17 20:47:02.744276 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:02.744252 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cdmls"] Apr 17 20:47:02.750863 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:47:02.750835 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeac69935_3fd7_4a29_9949_4a6e86df9ae0.slice/crio-6efef70f3f33c0953e80c1bde9ab52521ec32580dfb1b3f01760d382ec88d69a WatchSource:0}: Error finding container 6efef70f3f33c0953e80c1bde9ab52521ec32580dfb1b3f01760d382ec88d69a: Status 404 returned error can't find the container with id 6efef70f3f33c0953e80c1bde9ab52521ec32580dfb1b3f01760d382ec88d69a Apr 17 20:47:03.004986 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.004949 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-tls\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " 
pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:03.007234 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.007216 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6bbfd575-a18e-4b5a-b924-f78e8e962f05-node-exporter-tls\") pod \"node-exporter-hbxqp\" (UID: \"6bbfd575-a18e-4b5a-b924-f78e8e962f05\") " pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:03.147227 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.147145 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zbsnq" Apr 17 20:47:03.180712 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.180677 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" event={"ID":"eac69935-3fd7-4a29-9949-4a6e86df9ae0","Type":"ContainerStarted","Data":"6efef70f3f33c0953e80c1bde9ab52521ec32580dfb1b3f01760d382ec88d69a"} Apr 17 20:47:03.182297 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.182270 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" event={"ID":"d4390455-ccf6-4110-ab32-8ce17b8d9693","Type":"ContainerStarted","Data":"87afa08d451f63d47a9254cad16c4b0923be8bf27870279a47c46b1f81cdbecc"} Apr 17 20:47:03.182409 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.182302 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" event={"ID":"d4390455-ccf6-4110-ab32-8ce17b8d9693","Type":"ContainerStarted","Data":"0d7a35430c6f3bf7ad1b3274f8450f13b1af356dcd4f36d123cdd4c5d2b80688"} Apr 17 20:47:03.182409 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.182315 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" 
event={"ID":"d4390455-ccf6-4110-ab32-8ce17b8d9693","Type":"ContainerStarted","Data":"9275ef42faf70ce1352760953c1e4a2773816dff3d222a061c66859f488c7ad8"} Apr 17 20:47:03.205498 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.205468 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hbxqp" Apr 17 20:47:03.213788 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:47:03.213757 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bbfd575_a18e_4b5a_b924_f78e8e962f05.slice/crio-b7e9af1702ef82173cea933a96e7fb0c4e002678319c1153c1797a44e2014dc7 WatchSource:0}: Error finding container b7e9af1702ef82173cea933a96e7fb0c4e002678319c1153c1797a44e2014dc7: Status 404 returned error can't find the container with id b7e9af1702ef82173cea933a96e7fb0c4e002678319c1153c1797a44e2014dc7 Apr 17 20:47:03.341228 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.341190 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:47:03.346513 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.346487 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.348401 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.348377 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 20:47:03.348620 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.348602 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 20:47:03.348748 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.348706 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 20:47:03.348748 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.348729 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 20:47:03.349154 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.348990 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 20:47:03.349154 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.348999 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 20:47:03.349154 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.349016 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 20:47:03.349154 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.349027 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 20:47:03.349154 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.349021 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 20:47:03.349154 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.349018 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-ggvr5\"" Apr 17 20:47:03.356833 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.356795 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:47:03.407504 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.407425 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de33ee91-814f-4e24-92a9-20b1ec5751a9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.407504 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.407462 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de33ee91-814f-4e24-92a9-20b1ec5751a9-config-out\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.407504 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.407485 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de33ee91-814f-4e24-92a9-20b1ec5751a9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.407759 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.407547 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.407759 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.407581 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.407759 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.407603 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-web-config\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.407759 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.407648 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nftd\" (UniqueName: \"kubernetes.io/projected/de33ee91-814f-4e24-92a9-20b1ec5751a9-kube-api-access-4nftd\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.407759 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.407704 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
20:47:03.407759 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.407749 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/de33ee91-814f-4e24-92a9-20b1ec5751a9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.408090 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.407825 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.408090 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.407841 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de33ee91-814f-4e24-92a9-20b1ec5751a9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.408090 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.407856 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-config-volume\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.408090 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.407922 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.508658 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.508620 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de33ee91-814f-4e24-92a9-20b1ec5751a9-config-out\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.508658 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.508680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de33ee91-814f-4e24-92a9-20b1ec5751a9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.508930 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.508722 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.508930 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.508753 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.508930 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.508776 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-web-config\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.508930 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.508821 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nftd\" (UniqueName: \"kubernetes.io/projected/de33ee91-814f-4e24-92a9-20b1ec5751a9-kube-api-access-4nftd\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.508930 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.508853 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.508930 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.508893 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/de33ee91-814f-4e24-92a9-20b1ec5751a9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.509224 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.508946 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.509224 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:47:03.508971 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de33ee91-814f-4e24-92a9-20b1ec5751a9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.509224 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.508995 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-config-volume\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.509224 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.509029 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.509224 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.509058 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de33ee91-814f-4e24-92a9-20b1ec5751a9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.510363 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.510220 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/de33ee91-814f-4e24-92a9-20b1ec5751a9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: 
\"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.510363 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.510326 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de33ee91-814f-4e24-92a9-20b1ec5751a9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.510582 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.510555 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de33ee91-814f-4e24-92a9-20b1ec5751a9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.513930 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.513885 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.515992 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.515971 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-config-volume\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.517710 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.517606 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.517818 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.517731 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.518535 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.518010 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-web-config\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.518535 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.518029 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de33ee91-814f-4e24-92a9-20b1ec5751a9-config-out\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.518535 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.518109 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nftd\" (UniqueName: \"kubernetes.io/projected/de33ee91-814f-4e24-92a9-20b1ec5751a9-kube-api-access-4nftd\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.518535 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.518254 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.518535 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.518503 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de33ee91-814f-4e24-92a9-20b1ec5751a9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.521213 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.521161 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.657015 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.656982 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:47:03.814394 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:03.814343 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:47:03.996514 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:47:03.996447 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde33ee91_814f_4e24_92a9_20b1ec5751a9.slice/crio-1f4c13a2a13b8b863ba04cd7c9dfdc0363b153734a914ec59cdcbfe9750f3df3 WatchSource:0}: Error finding container 1f4c13a2a13b8b863ba04cd7c9dfdc0363b153734a914ec59cdcbfe9750f3df3: Status 404 returned error can't find the container with id 1f4c13a2a13b8b863ba04cd7c9dfdc0363b153734a914ec59cdcbfe9750f3df3 Apr 17 20:47:04.186914 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:04.186838 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hbxqp" event={"ID":"6bbfd575-a18e-4b5a-b924-f78e8e962f05","Type":"ContainerStarted","Data":"b7e9af1702ef82173cea933a96e7fb0c4e002678319c1153c1797a44e2014dc7"} Apr 17 20:47:04.187943 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:04.187914 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerStarted","Data":"1f4c13a2a13b8b863ba04cd7c9dfdc0363b153734a914ec59cdcbfe9750f3df3"} Apr 17 20:47:05.193736 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.193697 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" event={"ID":"d4390455-ccf6-4110-ab32-8ce17b8d9693","Type":"ContainerStarted","Data":"f878a13b61800f461752297d979ac9941e24f2e5d4ecd3836da122c8d4f6446f"} Apr 17 20:47:05.195417 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.195393 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="6bbfd575-a18e-4b5a-b924-f78e8e962f05" containerID="f1d6694bc28958c528ea902e49c49224547d40304d1ae559ec2f9fe5c7d1aa9d" exitCode=0 Apr 17 20:47:05.195551 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.195482 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hbxqp" event={"ID":"6bbfd575-a18e-4b5a-b924-f78e8e962f05","Type":"ContainerDied","Data":"f1d6694bc28958c528ea902e49c49224547d40304d1ae559ec2f9fe5c7d1aa9d"} Apr 17 20:47:05.197639 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.197617 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" event={"ID":"eac69935-3fd7-4a29-9949-4a6e86df9ae0","Type":"ContainerStarted","Data":"3e1107a67673895e9ae4bae94629bc4fefb4311395f7eb38ab24e78d667cbf4d"} Apr 17 20:47:05.197732 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.197646 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" event={"ID":"eac69935-3fd7-4a29-9949-4a6e86df9ae0","Type":"ContainerStarted","Data":"eee4690b73f31a297b4994ca40cced5f5460b15066e3d2b1bb0caf05a8e7c3b4"} Apr 17 20:47:05.197732 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.197660 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" event={"ID":"eac69935-3fd7-4a29-9949-4a6e86df9ae0","Type":"ContainerStarted","Data":"23ab10970e68fa55faced0c663535656cddb655eab0023c5c66e49de1963c959"} Apr 17 20:47:05.210925 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.210872 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cdg4h" podStartSLOduration=1.5365913199999999 podStartE2EDuration="3.210856298s" podCreationTimestamp="2026-04-17 20:47:02 +0000 UTC" firstStartedPulling="2026-04-17 20:47:02.844415041 +0000 UTC m=+174.569682378" lastFinishedPulling="2026-04-17 
20:47:04.518680022 +0000 UTC m=+176.243947356" observedRunningTime="2026-04-17 20:47:05.209589452 +0000 UTC m=+176.934856810" watchObservedRunningTime="2026-04-17 20:47:05.210856298 +0000 UTC m=+176.936123660" Apr 17 20:47:05.242883 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.242788 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-cdmls" podStartSLOduration=1.476721132 podStartE2EDuration="3.242771955s" podCreationTimestamp="2026-04-17 20:47:02 +0000 UTC" firstStartedPulling="2026-04-17 20:47:02.752591015 +0000 UTC m=+174.477858351" lastFinishedPulling="2026-04-17 20:47:04.518641839 +0000 UTC m=+176.243909174" observedRunningTime="2026-04-17 20:47:05.242468046 +0000 UTC m=+176.967735407" watchObservedRunningTime="2026-04-17 20:47:05.242771955 +0000 UTC m=+176.968039313" Apr 17 20:47:05.266374 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.266300 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-85dd6fc5-s6bh8"] Apr 17 20:47:05.269619 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.269599 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.271437 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.271417 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 20:47:05.271911 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.271671 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 20:47:05.271911 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.271769 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 20:47:05.272080 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.272062 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 20:47:05.272150 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.272117 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-eol42ptuke9k4\"" Apr 17 20:47:05.272222 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.272064 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 20:47:05.272349 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.272332 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-rt9l7\"" Apr 17 20:47:05.280264 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.280246 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-85dd6fc5-s6bh8"] Apr 17 20:47:05.326341 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.326302 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75c7ed08-d830-49b4-bf0c-9283732d0744-metrics-client-ca\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.326521 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.326356 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.326521 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.326424 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-tls\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.326521 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.326503 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5ksn\" (UniqueName: \"kubernetes.io/projected/75c7ed08-d830-49b4-bf0c-9283732d0744-kube-api-access-w5ksn\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.326686 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.326584 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.326686 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.326613 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-grpc-tls\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.326874 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.326842 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.326964 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.326874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.428172 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.428135 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-kube-rbac-proxy\") pod 
\"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.428286 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.428178 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.428286 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.428245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75c7ed08-d830-49b4-bf0c-9283732d0744-metrics-client-ca\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.428286 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.428279 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.428434 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.428312 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-tls\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.428434 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:47:05.428350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5ksn\" (UniqueName: \"kubernetes.io/projected/75c7ed08-d830-49b4-bf0c-9283732d0744-kube-api-access-w5ksn\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.428434 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.428384 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.428434 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.428410 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-grpc-tls\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.430554 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.430286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75c7ed08-d830-49b4-bf0c-9283732d0744-metrics-client-ca\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.431581 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.431532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.432195 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.431927 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.432878 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.432832 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-tls\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.433101 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.433068 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-grpc-tls\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.433526 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.433506 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " 
pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.433712 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.433685 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/75c7ed08-d830-49b4-bf0c-9283732d0744-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.435553 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.435529 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5ksn\" (UniqueName: \"kubernetes.io/projected/75c7ed08-d830-49b4-bf0c-9283732d0744-kube-api-access-w5ksn\") pod \"thanos-querier-85dd6fc5-s6bh8\" (UID: \"75c7ed08-d830-49b4-bf0c-9283732d0744\") " pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.582521 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.582482 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:05.701866 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:05.701455 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-85dd6fc5-s6bh8"] Apr 17 20:47:05.704132 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:47:05.704094 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75c7ed08_d830_49b4_bf0c_9283732d0744.slice/crio-26be0cc7a1c7a864203d4ac673db82115cf45fb4f07672ce5cd4ea19c36f796a WatchSource:0}: Error finding container 26be0cc7a1c7a864203d4ac673db82115cf45fb4f07672ce5cd4ea19c36f796a: Status 404 returned error can't find the container with id 26be0cc7a1c7a864203d4ac673db82115cf45fb4f07672ce5cd4ea19c36f796a Apr 17 20:47:06.203365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:06.203329 2572 generic.go:358] "Generic (PLEG): container finished" podID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerID="af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65" exitCode=0 Apr 17 20:47:06.203825 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:06.203417 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerDied","Data":"af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65"} Apr 17 20:47:06.205597 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:06.205566 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hbxqp" event={"ID":"6bbfd575-a18e-4b5a-b924-f78e8e962f05","Type":"ContainerStarted","Data":"79f00890765872f5cd991a88bdf2dd027a0c38f5b12c26d891ac2efa169d1904"} Apr 17 20:47:06.205735 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:06.205615 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hbxqp" 
event={"ID":"6bbfd575-a18e-4b5a-b924-f78e8e962f05","Type":"ContainerStarted","Data":"9fb166d1e5933596225cfd2eebdf415276b26b04d540358d68f95c8c30029651"} Apr 17 20:47:06.206819 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:06.206741 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" event={"ID":"75c7ed08-d830-49b4-bf0c-9283732d0744","Type":"ContainerStarted","Data":"26be0cc7a1c7a864203d4ac673db82115cf45fb4f07672ce5cd4ea19c36f796a"} Apr 17 20:47:06.240870 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:06.240778 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hbxqp" podStartSLOduration=2.9359697909999998 podStartE2EDuration="4.240761221s" podCreationTimestamp="2026-04-17 20:47:02 +0000 UTC" firstStartedPulling="2026-04-17 20:47:03.21546149 +0000 UTC m=+174.940728826" lastFinishedPulling="2026-04-17 20:47:04.520252916 +0000 UTC m=+176.245520256" observedRunningTime="2026-04-17 20:47:06.239842633 +0000 UTC m=+177.965109992" watchObservedRunningTime="2026-04-17 20:47:06.240761221 +0000 UTC m=+177.966028579" Apr 17 20:47:07.058564 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:07.058530 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-j7x6j"] Apr 17 20:47:07.060424 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:07.060410 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7x6j" Apr 17 20:47:07.062270 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:07.062245 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 20:47:07.062420 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:07.062403 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-sspg4\"" Apr 17 20:47:07.069470 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:07.069362 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-j7x6j"] Apr 17 20:47:07.145209 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:07.145170 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ad299635-2784-4210-a8f7-45838c9eab4a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-j7x6j\" (UID: \"ad299635-2784-4210-a8f7-45838c9eab4a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7x6j" Apr 17 20:47:07.246930 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:07.246607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ad299635-2784-4210-a8f7-45838c9eab4a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-j7x6j\" (UID: \"ad299635-2784-4210-a8f7-45838c9eab4a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7x6j" Apr 17 20:47:07.246930 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:47:07.246770 2572 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 20:47:07.246930 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:47:07.246859 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ad299635-2784-4210-a8f7-45838c9eab4a-monitoring-plugin-cert podName:ad299635-2784-4210-a8f7-45838c9eab4a nodeName:}" failed. No retries permitted until 2026-04-17 20:47:07.746838931 +0000 UTC m=+179.472106270 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/ad299635-2784-4210-a8f7-45838c9eab4a-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-j7x6j" (UID: "ad299635-2784-4210-a8f7-45838c9eab4a") : secret "monitoring-plugin-cert" not found Apr 17 20:47:07.752039 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:07.752005 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ad299635-2784-4210-a8f7-45838c9eab4a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-j7x6j\" (UID: \"ad299635-2784-4210-a8f7-45838c9eab4a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7x6j" Apr 17 20:47:07.754652 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:07.754629 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ad299635-2784-4210-a8f7-45838c9eab4a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-j7x6j\" (UID: \"ad299635-2784-4210-a8f7-45838c9eab4a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7x6j" Apr 17 20:47:07.971445 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:07.971414 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7x6j" Apr 17 20:47:08.215077 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:08.215045 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" event={"ID":"75c7ed08-d830-49b4-bf0c-9283732d0744","Type":"ContainerStarted","Data":"ebe9aef3ca41e4bfec57b9136d8f90c29bb45a2b0a1ea0f7e1129fe68315dc26"} Apr 17 20:47:08.218137 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:08.218078 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerStarted","Data":"d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807"} Apr 17 20:47:08.251113 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:08.250964 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-j7x6j"] Apr 17 20:47:08.256643 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:47:08.256603 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad299635_2784_4210_a8f7_45838c9eab4a.slice/crio-4efdf8c985654f2b32b7ad217294f3f3b94ec7272f64de127ccbd3e7132974f1 WatchSource:0}: Error finding container 4efdf8c985654f2b32b7ad217294f3f3b94ec7272f64de127ccbd3e7132974f1: Status 404 returned error can't find the container with id 4efdf8c985654f2b32b7ad217294f3f3b94ec7272f64de127ccbd3e7132974f1 Apr 17 20:47:09.222684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:09.222646 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" event={"ID":"75c7ed08-d830-49b4-bf0c-9283732d0744","Type":"ContainerStarted","Data":"f47a905da1e407dad0a8faa05b71dcb7754707d1e8de7eae627cd2fa79b5fcfa"} Apr 17 20:47:09.222684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:09.222687 2572 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" event={"ID":"75c7ed08-d830-49b4-bf0c-9283732d0744","Type":"ContainerStarted","Data":"78042cf47ccdc6e5cdd9696b050167d9f21913538a44191d752861598845f3dd"} Apr 17 20:47:09.223782 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:09.223752 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7x6j" event={"ID":"ad299635-2784-4210-a8f7-45838c9eab4a","Type":"ContainerStarted","Data":"4efdf8c985654f2b32b7ad217294f3f3b94ec7272f64de127ccbd3e7132974f1"} Apr 17 20:47:09.226223 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:09.226184 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerStarted","Data":"30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1"} Apr 17 20:47:09.226223 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:09.226216 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerStarted","Data":"de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416"} Apr 17 20:47:09.226223 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:09.226227 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerStarted","Data":"c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08"} Apr 17 20:47:09.226388 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:09.226236 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerStarted","Data":"0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f"} Apr 17 20:47:10.231775 ip-10-0-137-102 kubenswrapper[2572]: I0417 
20:47:10.231738 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" event={"ID":"75c7ed08-d830-49b4-bf0c-9283732d0744","Type":"ContainerStarted","Data":"1fc92239b6593a3bb04b3b57935d95da9c75416875f3892ae6dfd05c6bc77f1a"} Apr 17 20:47:10.231775 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:10.231772 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" event={"ID":"75c7ed08-d830-49b4-bf0c-9283732d0744","Type":"ContainerStarted","Data":"797dd67fd0211573ce0ebbaa3a04918dd755a0803276bf078cb5475417a08332"} Apr 17 20:47:10.232242 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:10.231783 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" event={"ID":"75c7ed08-d830-49b4-bf0c-9283732d0744","Type":"ContainerStarted","Data":"743d353f54af50f5704d3201350a4ebf566eb643ef3468c6974fd71deee67f27"} Apr 17 20:47:10.232242 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:10.231926 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:10.233172 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:10.233152 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7x6j" event={"ID":"ad299635-2784-4210-a8f7-45838c9eab4a","Type":"ContainerStarted","Data":"378bc52f18abb2e2ad2dc6049fcf650f5493e466e55bfab2873335008b6f619a"} Apr 17 20:47:10.233333 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:10.233319 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7x6j" Apr 17 20:47:10.236264 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:10.236241 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerStarted","Data":"f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16"} Apr 17 20:47:10.238404 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:10.238389 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7x6j" Apr 17 20:47:10.252525 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:10.252431 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" podStartSLOduration=1.354739864 podStartE2EDuration="5.252418972s" podCreationTimestamp="2026-04-17 20:47:05 +0000 UTC" firstStartedPulling="2026-04-17 20:47:05.705916204 +0000 UTC m=+177.431183542" lastFinishedPulling="2026-04-17 20:47:09.603595316 +0000 UTC m=+181.328862650" observedRunningTime="2026-04-17 20:47:10.251357939 +0000 UTC m=+181.976625296" watchObservedRunningTime="2026-04-17 20:47:10.252418972 +0000 UTC m=+181.977686330" Apr 17 20:47:10.264342 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:10.264307 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7x6j" podStartSLOduration=1.919491688 podStartE2EDuration="3.264297257s" podCreationTimestamp="2026-04-17 20:47:07 +0000 UTC" firstStartedPulling="2026-04-17 20:47:08.258708912 +0000 UTC m=+179.983976247" lastFinishedPulling="2026-04-17 20:47:09.603514481 +0000 UTC m=+181.328781816" observedRunningTime="2026-04-17 20:47:10.262871761 +0000 UTC m=+181.988139130" watchObservedRunningTime="2026-04-17 20:47:10.264297257 +0000 UTC m=+181.989564614" Apr 17 20:47:10.285491 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:10.285457 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.679110721 podStartE2EDuration="7.285447167s" podCreationTimestamp="2026-04-17 20:47:03 +0000 
UTC" firstStartedPulling="2026-04-17 20:47:03.998422368 +0000 UTC m=+175.723689702" lastFinishedPulling="2026-04-17 20:47:09.604758807 +0000 UTC m=+181.330026148" observedRunningTime="2026-04-17 20:47:10.284419826 +0000 UTC m=+182.009687186" watchObservedRunningTime="2026-04-17 20:47:10.285447167 +0000 UTC m=+182.010714523" Apr 17 20:47:16.247305 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:16.247279 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-85dd6fc5-s6bh8" Apr 17 20:47:19.304601 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:19.304572 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-ddvr7"] Apr 17 20:47:19.307877 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:19.307859 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-ddvr7" Apr 17 20:47:19.309847 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:19.309819 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 20:47:19.309959 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:19.309865 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 20:47:19.310148 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:19.310134 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-62mgm\"" Apr 17 20:47:19.316636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:19.316614 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-ddvr7"] Apr 17 20:47:19.446435 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:19.446402 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn2lq\" (UniqueName: 
\"kubernetes.io/projected/bcde0453-95f9-4d89-8211-99b7ac7f0b68-kube-api-access-rn2lq\") pod \"downloads-6bcc868b7-ddvr7\" (UID: \"bcde0453-95f9-4d89-8211-99b7ac7f0b68\") " pod="openshift-console/downloads-6bcc868b7-ddvr7" Apr 17 20:47:19.547114 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:19.547090 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn2lq\" (UniqueName: \"kubernetes.io/projected/bcde0453-95f9-4d89-8211-99b7ac7f0b68-kube-api-access-rn2lq\") pod \"downloads-6bcc868b7-ddvr7\" (UID: \"bcde0453-95f9-4d89-8211-99b7ac7f0b68\") " pod="openshift-console/downloads-6bcc868b7-ddvr7" Apr 17 20:47:19.553974 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:19.553950 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn2lq\" (UniqueName: \"kubernetes.io/projected/bcde0453-95f9-4d89-8211-99b7ac7f0b68-kube-api-access-rn2lq\") pod \"downloads-6bcc868b7-ddvr7\" (UID: \"bcde0453-95f9-4d89-8211-99b7ac7f0b68\") " pod="openshift-console/downloads-6bcc868b7-ddvr7" Apr 17 20:47:19.617441 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:19.617377 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-ddvr7" Apr 17 20:47:19.733556 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:19.733533 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-ddvr7"] Apr 17 20:47:19.735848 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:47:19.735816 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcde0453_95f9_4d89_8211_99b7ac7f0b68.slice/crio-0900ad19ce22a18234535234a01e3a6636458cb4d8fce59ae376b02cb67f3259 WatchSource:0}: Error finding container 0900ad19ce22a18234535234a01e3a6636458cb4d8fce59ae376b02cb67f3259: Status 404 returned error can't find the container with id 0900ad19ce22a18234535234a01e3a6636458cb4d8fce59ae376b02cb67f3259 Apr 17 20:47:20.267244 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:20.267208 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-ddvr7" event={"ID":"bcde0453-95f9-4d89-8211-99b7ac7f0b68","Type":"ContainerStarted","Data":"0900ad19ce22a18234535234a01e3a6636458cb4d8fce59ae376b02cb67f3259"} Apr 17 20:47:30.225261 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.225219 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-554bf89658-ctcmr"] Apr 17 20:47:30.231765 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.231734 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.234425 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.234394 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 20:47:30.234425 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.234400 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 20:47:30.234425 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.234387 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-bv9v2\"" Apr 17 20:47:30.234662 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.234565 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 20:47:30.234662 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.234566 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 20:47:30.234759 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.234732 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 20:47:30.239893 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.239861 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-554bf89658-ctcmr"] Apr 17 20:47:30.348087 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.348048 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2gfv\" (UniqueName: \"kubernetes.io/projected/56449056-1002-4874-a19c-270abba8a1a7-kube-api-access-k2gfv\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.348279 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:47:30.348108 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-console-config\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.348279 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.348197 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56449056-1002-4874-a19c-270abba8a1a7-console-serving-cert\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.348363 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.348309 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-service-ca\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.348363 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.348341 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-oauth-serving-cert\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.348495 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.348411 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56449056-1002-4874-a19c-270abba8a1a7-console-oauth-config\") pod 
\"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.449120 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.449086 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2gfv\" (UniqueName: \"kubernetes.io/projected/56449056-1002-4874-a19c-270abba8a1a7-kube-api-access-k2gfv\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.449313 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.449138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-console-config\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.449313 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.449179 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56449056-1002-4874-a19c-270abba8a1a7-console-serving-cert\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.449313 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.449251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-service-ca\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.449313 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.449309 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-oauth-serving-cert\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.449518 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.449352 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56449056-1002-4874-a19c-270abba8a1a7-console-oauth-config\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.450244 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.450203 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-service-ca\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.450356 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.450261 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-oauth-serving-cert\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.450712 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.450688 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-console-config\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.452026 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.452002 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56449056-1002-4874-a19c-270abba8a1a7-console-oauth-config\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.452130 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.452078 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56449056-1002-4874-a19c-270abba8a1a7-console-serving-cert\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.456666 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.456645 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2gfv\" (UniqueName: \"kubernetes.io/projected/56449056-1002-4874-a19c-270abba8a1a7-kube-api-access-k2gfv\") pod \"console-554bf89658-ctcmr\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.545734 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.545695 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:30.699054 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:30.699027 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-554bf89658-ctcmr"] Apr 17 20:47:30.701526 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:47:30.701484 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56449056_1002_4874_a19c_270abba8a1a7.slice/crio-9be5c3ace49cf97ed1525ddfd5bc6479a3941f68b2d066f0304a567d777d5076 WatchSource:0}: Error finding container 9be5c3ace49cf97ed1525ddfd5bc6479a3941f68b2d066f0304a567d777d5076: Status 404 returned error can't find the container with id 9be5c3ace49cf97ed1525ddfd5bc6479a3941f68b2d066f0304a567d777d5076 Apr 17 20:47:31.303665 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:31.303631 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-554bf89658-ctcmr" event={"ID":"56449056-1002-4874-a19c-270abba8a1a7","Type":"ContainerStarted","Data":"9be5c3ace49cf97ed1525ddfd5bc6479a3941f68b2d066f0304a567d777d5076"} Apr 17 20:47:39.009283 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.009247 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85ff8d8c7d-vhgzp"] Apr 17 20:47:39.014715 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.014691 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.021720 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.021670 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 20:47:39.022945 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.022921 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85ff8d8c7d-vhgzp"] Apr 17 20:47:39.131298 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.131278 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-serving-cert\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.131375 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.131334 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-config\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.131429 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.131381 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-oauth-serving-cert\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.131429 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.131405 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-oauth-config\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.131541 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.131426 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-trusted-ca-bundle\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.131592 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.131556 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-service-ca\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.131645 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.131595 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdvvm\" (UniqueName: \"kubernetes.io/projected/1a20d1f3-710a-487c-82a5-dca6d37f57c1-kube-api-access-rdvvm\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.232254 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.232224 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdvvm\" (UniqueName: \"kubernetes.io/projected/1a20d1f3-710a-487c-82a5-dca6d37f57c1-kube-api-access-rdvvm\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.232401 ip-10-0-137-102 kubenswrapper[2572]: I0417 
20:47:39.232272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-serving-cert\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.232401 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.232303 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-config\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.232401 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.232332 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-oauth-serving-cert\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.232401 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.232353 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-oauth-config\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.232401 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.232373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-trusted-ca-bundle\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " 
pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.232401 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.232400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-service-ca\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.233102 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.233078 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-service-ca\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.233102 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.233089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-oauth-serving-cert\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.233513 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.233488 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-config\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.233513 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.233515 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-trusted-ca-bundle\") pod \"console-85ff8d8c7d-vhgzp\" (UID: 
\"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.235002 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.234979 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-oauth-config\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.235112 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.235002 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-serving-cert\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.239404 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.239368 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdvvm\" (UniqueName: \"kubernetes.io/projected/1a20d1f3-710a-487c-82a5-dca6d37f57c1-kube-api-access-rdvvm\") pod \"console-85ff8d8c7d-vhgzp\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.329196 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.329162 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:39.330880 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.330835 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-ddvr7" event={"ID":"bcde0453-95f9-4d89-8211-99b7ac7f0b68","Type":"ContainerStarted","Data":"12f4afb201de5c20b4ba5b27aeb3eee23f16fa5bdcc0fa4cee369c0cd5eb395f"} Apr 17 20:47:39.331103 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.331081 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-ddvr7" Apr 17 20:47:39.332572 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.332544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-554bf89658-ctcmr" event={"ID":"56449056-1002-4874-a19c-270abba8a1a7","Type":"ContainerStarted","Data":"9cb21e26c51887d8357fbf9bdfa0e2efde8d7a20d856da915f606c5c100030fa"} Apr 17 20:47:39.339754 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.339707 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-ddvr7" Apr 17 20:47:39.346183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.346143 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-ddvr7" podStartSLOduration=0.957017717 podStartE2EDuration="20.34613081s" podCreationTimestamp="2026-04-17 20:47:19 +0000 UTC" firstStartedPulling="2026-04-17 20:47:19.737754 +0000 UTC m=+191.463021350" lastFinishedPulling="2026-04-17 20:47:39.126867101 +0000 UTC m=+210.852134443" observedRunningTime="2026-04-17 20:47:39.344351472 +0000 UTC m=+211.069618832" watchObservedRunningTime="2026-04-17 20:47:39.34613081 +0000 UTC m=+211.071398167" Apr 17 20:47:39.374216 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.374157 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-554bf89658-ctcmr" podStartSLOduration=0.991963748 podStartE2EDuration="9.374138972s" podCreationTimestamp="2026-04-17 20:47:30 +0000 UTC" firstStartedPulling="2026-04-17 20:47:30.70404176 +0000 UTC m=+202.429309108" lastFinishedPulling="2026-04-17 20:47:39.086216995 +0000 UTC m=+210.811484332" observedRunningTime="2026-04-17 20:47:39.37360886 +0000 UTC m=+211.098876231" watchObservedRunningTime="2026-04-17 20:47:39.374138972 +0000 UTC m=+211.099406333" Apr 17 20:47:39.476275 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:39.476244 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85ff8d8c7d-vhgzp"] Apr 17 20:47:39.479457 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:47:39.479426 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a20d1f3_710a_487c_82a5_dca6d37f57c1.slice/crio-8cb8780fa8b870ec48bc7a1d9e62c9f7d3a0b457390119866ce5683b39ef1be5 WatchSource:0}: Error finding container 8cb8780fa8b870ec48bc7a1d9e62c9f7d3a0b457390119866ce5683b39ef1be5: Status 404 returned error can't find the container with id 8cb8780fa8b870ec48bc7a1d9e62c9f7d3a0b457390119866ce5683b39ef1be5 Apr 17 20:47:40.337658 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:40.337613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85ff8d8c7d-vhgzp" event={"ID":"1a20d1f3-710a-487c-82a5-dca6d37f57c1","Type":"ContainerStarted","Data":"5d7dda3b0d011db8f8b5ef8bd926e3c7bdc177368b82154d1fc2b2b75157db68"} Apr 17 20:47:40.337658 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:40.337664 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85ff8d8c7d-vhgzp" event={"ID":"1a20d1f3-710a-487c-82a5-dca6d37f57c1","Type":"ContainerStarted","Data":"8cb8780fa8b870ec48bc7a1d9e62c9f7d3a0b457390119866ce5683b39ef1be5"} Apr 17 20:47:40.352773 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:40.352715 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85ff8d8c7d-vhgzp" podStartSLOduration=2.352692563 podStartE2EDuration="2.352692563s" podCreationTimestamp="2026-04-17 20:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:47:40.352319286 +0000 UTC m=+212.077586658" watchObservedRunningTime="2026-04-17 20:47:40.352692563 +0000 UTC m=+212.077959921" Apr 17 20:47:40.546911 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:40.546872 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:40.547089 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:40.546920 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:40.552185 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:40.552162 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:41.345989 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:41.345961 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:47:49.329846 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:49.329790 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:49.330472 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:49.329861 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:49.335368 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:49.335348 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:49.372629 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:47:49.372606 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:47:49.412968 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:47:49.412939 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-554bf89658-ctcmr"] Apr 17 20:48:14.432868 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.432827 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-554bf89658-ctcmr" podUID="56449056-1002-4874-a19c-270abba8a1a7" containerName="console" containerID="cri-o://9cb21e26c51887d8357fbf9bdfa0e2efde8d7a20d856da915f606c5c100030fa" gracePeriod=15 Apr 17 20:48:14.761312 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.761286 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-554bf89658-ctcmr_56449056-1002-4874-a19c-270abba8a1a7/console/0.log" Apr 17 20:48:14.761448 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.761344 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:48:14.945121 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.945087 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56449056-1002-4874-a19c-270abba8a1a7-console-serving-cert\") pod \"56449056-1002-4874-a19c-270abba8a1a7\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " Apr 17 20:48:14.945395 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.945378 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-service-ca\") pod \"56449056-1002-4874-a19c-270abba8a1a7\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " Apr 17 20:48:14.945891 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.945863 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-oauth-serving-cert\") pod \"56449056-1002-4874-a19c-270abba8a1a7\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " Apr 17 20:48:14.946441 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.945771 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-service-ca" (OuterVolumeSpecName: "service-ca") pod "56449056-1002-4874-a19c-270abba8a1a7" (UID: "56449056-1002-4874-a19c-270abba8a1a7"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:48:14.946441 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.946330 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "56449056-1002-4874-a19c-270abba8a1a7" (UID: "56449056-1002-4874-a19c-270abba8a1a7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:48:14.946640 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.946625 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-console-config\") pod \"56449056-1002-4874-a19c-270abba8a1a7\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " Apr 17 20:48:14.947089 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.946975 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-console-config" (OuterVolumeSpecName: "console-config") pod "56449056-1002-4874-a19c-270abba8a1a7" (UID: "56449056-1002-4874-a19c-270abba8a1a7"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:48:14.947279 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.947264 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2gfv\" (UniqueName: \"kubernetes.io/projected/56449056-1002-4874-a19c-270abba8a1a7-kube-api-access-k2gfv\") pod \"56449056-1002-4874-a19c-270abba8a1a7\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " Apr 17 20:48:14.947683 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.947667 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56449056-1002-4874-a19c-270abba8a1a7-console-oauth-config\") pod \"56449056-1002-4874-a19c-270abba8a1a7\" (UID: \"56449056-1002-4874-a19c-270abba8a1a7\") " Apr 17 20:48:14.948353 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.948334 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-console-config\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:14.948479 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.948465 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-service-ca\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:14.948578 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.948566 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56449056-1002-4874-a19c-270abba8a1a7-oauth-serving-cert\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:14.953236 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.953209 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/56449056-1002-4874-a19c-270abba8a1a7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "56449056-1002-4874-a19c-270abba8a1a7" (UID: "56449056-1002-4874-a19c-270abba8a1a7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:48:14.953236 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.953219 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56449056-1002-4874-a19c-270abba8a1a7-kube-api-access-k2gfv" (OuterVolumeSpecName: "kube-api-access-k2gfv") pod "56449056-1002-4874-a19c-270abba8a1a7" (UID: "56449056-1002-4874-a19c-270abba8a1a7"). InnerVolumeSpecName "kube-api-access-k2gfv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:48:14.953398 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:14.953239 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56449056-1002-4874-a19c-270abba8a1a7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "56449056-1002-4874-a19c-270abba8a1a7" (UID: "56449056-1002-4874-a19c-270abba8a1a7"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:48:15.049563 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:15.049474 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k2gfv\" (UniqueName: \"kubernetes.io/projected/56449056-1002-4874-a19c-270abba8a1a7-kube-api-access-k2gfv\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:15.049563 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:15.049513 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56449056-1002-4874-a19c-270abba8a1a7-console-oauth-config\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:15.049563 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:15.049530 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56449056-1002-4874-a19c-270abba8a1a7-console-serving-cert\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:15.445984 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:15.445911 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-554bf89658-ctcmr_56449056-1002-4874-a19c-270abba8a1a7/console/0.log" Apr 17 20:48:15.445984 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:15.445955 2572 generic.go:358] "Generic (PLEG): container finished" podID="56449056-1002-4874-a19c-270abba8a1a7" containerID="9cb21e26c51887d8357fbf9bdfa0e2efde8d7a20d856da915f606c5c100030fa" exitCode=2 Apr 17 20:48:15.446450 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:15.446008 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-554bf89658-ctcmr" event={"ID":"56449056-1002-4874-a19c-270abba8a1a7","Type":"ContainerDied","Data":"9cb21e26c51887d8357fbf9bdfa0e2efde8d7a20d856da915f606c5c100030fa"} Apr 17 20:48:15.446450 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:15.446018 2572 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-console/console-554bf89658-ctcmr" Apr 17 20:48:15.446450 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:15.446043 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-554bf89658-ctcmr" event={"ID":"56449056-1002-4874-a19c-270abba8a1a7","Type":"ContainerDied","Data":"9be5c3ace49cf97ed1525ddfd5bc6479a3941f68b2d066f0304a567d777d5076"} Apr 17 20:48:15.446450 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:15.446063 2572 scope.go:117] "RemoveContainer" containerID="9cb21e26c51887d8357fbf9bdfa0e2efde8d7a20d856da915f606c5c100030fa" Apr 17 20:48:15.470476 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:15.470451 2572 scope.go:117] "RemoveContainer" containerID="9cb21e26c51887d8357fbf9bdfa0e2efde8d7a20d856da915f606c5c100030fa" Apr 17 20:48:15.470741 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:15.470704 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-554bf89658-ctcmr"] Apr 17 20:48:15.470945 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:48:15.470916 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cb21e26c51887d8357fbf9bdfa0e2efde8d7a20d856da915f606c5c100030fa\": container with ID starting with 9cb21e26c51887d8357fbf9bdfa0e2efde8d7a20d856da915f606c5c100030fa not found: ID does not exist" containerID="9cb21e26c51887d8357fbf9bdfa0e2efde8d7a20d856da915f606c5c100030fa" Apr 17 20:48:15.471041 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:15.470957 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cb21e26c51887d8357fbf9bdfa0e2efde8d7a20d856da915f606c5c100030fa"} err="failed to get container status \"9cb21e26c51887d8357fbf9bdfa0e2efde8d7a20d856da915f606c5c100030fa\": rpc error: code = NotFound desc = could not find container \"9cb21e26c51887d8357fbf9bdfa0e2efde8d7a20d856da915f606c5c100030fa\": container with ID 
starting with 9cb21e26c51887d8357fbf9bdfa0e2efde8d7a20d856da915f606c5c100030fa not found: ID does not exist" Apr 17 20:48:15.473257 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:15.473235 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-554bf89658-ctcmr"] Apr 17 20:48:16.712946 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:16.712912 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56449056-1002-4874-a19c-270abba8a1a7" path="/var/lib/kubelet/pods/56449056-1002-4874-a19c-270abba8a1a7/volumes" Apr 17 20:48:20.491150 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:20.491074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") " pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:48:20.493343 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:20.493312 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83681f84-53f3-489d-9b30-0db22fc1b40e-metrics-certs\") pod \"network-metrics-daemon-gggxp\" (UID: \"83681f84-53f3-489d-9b30-0db22fc1b40e\") " pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:48:20.711031 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:20.711002 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jzn7k\"" Apr 17 20:48:20.719429 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:20.719408 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gggxp" Apr 17 20:48:20.834224 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:20.834183 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gggxp"] Apr 17 20:48:20.836852 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:48:20.836822 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83681f84_53f3_489d_9b30_0db22fc1b40e.slice/crio-2a9b2ac9a23684d9812bb36b0d47edeabc388514e2c616b9a201dfb8744be0b2 WatchSource:0}: Error finding container 2a9b2ac9a23684d9812bb36b0d47edeabc388514e2c616b9a201dfb8744be0b2: Status 404 returned error can't find the container with id 2a9b2ac9a23684d9812bb36b0d47edeabc388514e2c616b9a201dfb8744be0b2 Apr 17 20:48:21.467877 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:21.467795 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gggxp" event={"ID":"83681f84-53f3-489d-9b30-0db22fc1b40e","Type":"ContainerStarted","Data":"2a9b2ac9a23684d9812bb36b0d47edeabc388514e2c616b9a201dfb8744be0b2"} Apr 17 20:48:22.495699 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:22.495667 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:48:22.496159 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:22.496135 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="alertmanager" containerID="cri-o://d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807" gracePeriod=120 Apr 17 20:48:22.496214 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:22.496190 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" 
containerName="kube-rbac-proxy-metric" containerID="cri-o://30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1" gracePeriod=120 Apr 17 20:48:22.496267 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:22.496211 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="prom-label-proxy" containerID="cri-o://f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16" gracePeriod=120 Apr 17 20:48:22.496267 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:22.496230 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="kube-rbac-proxy-web" containerID="cri-o://c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08" gracePeriod=120 Apr 17 20:48:22.496358 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:22.496279 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="kube-rbac-proxy" containerID="cri-o://de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416" gracePeriod=120 Apr 17 20:48:22.496405 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:22.496271 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="config-reloader" containerID="cri-o://0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f" gracePeriod=120 Apr 17 20:48:23.475400 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:23.475364 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gggxp" 
event={"ID":"83681f84-53f3-489d-9b30-0db22fc1b40e","Type":"ContainerStarted","Data":"31f36b0d1f4ca6c4a5a21bfe4460decdbf2c6541761eb51761ce7594a51e2c4c"} Apr 17 20:48:23.475598 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:23.475406 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gggxp" event={"ID":"83681f84-53f3-489d-9b30-0db22fc1b40e","Type":"ContainerStarted","Data":"860bd0966e30e93434084ec3f35767ae132a09574aefaa4f8e05f60ad518c629"} Apr 17 20:48:23.478104 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:23.478076 2572 generic.go:358] "Generic (PLEG): container finished" podID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerID="f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16" exitCode=0 Apr 17 20:48:23.478104 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:23.478101 2572 generic.go:358] "Generic (PLEG): container finished" podID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerID="de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416" exitCode=0 Apr 17 20:48:23.478104 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:23.478108 2572 generic.go:358] "Generic (PLEG): container finished" podID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerID="0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f" exitCode=0 Apr 17 20:48:23.478379 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:23.478113 2572 generic.go:358] "Generic (PLEG): container finished" podID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerID="d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807" exitCode=0 Apr 17 20:48:23.478379 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:23.478153 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerDied","Data":"f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16"} Apr 17 20:48:23.478379 ip-10-0-137-102 kubenswrapper[2572]: I0417 
20:48:23.478181 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerDied","Data":"de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416"} Apr 17 20:48:23.478379 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:23.478192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerDied","Data":"0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f"} Apr 17 20:48:23.478379 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:23.478202 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerDied","Data":"d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807"} Apr 17 20:48:23.489317 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:23.489275 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gggxp" podStartSLOduration=253.818501152 podStartE2EDuration="4m15.489263492s" podCreationTimestamp="2026-04-17 20:44:08 +0000 UTC" firstStartedPulling="2026-04-17 20:48:20.838620754 +0000 UTC m=+252.563888094" lastFinishedPulling="2026-04-17 20:48:22.509383098 +0000 UTC m=+254.234650434" observedRunningTime="2026-04-17 20:48:23.487723299 +0000 UTC m=+255.212990653" watchObservedRunningTime="2026-04-17 20:48:23.489263492 +0000 UTC m=+255.214530848" Apr 17 20:48:24.246424 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.246402 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:48:24.323047 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.323016 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de33ee91-814f-4e24-92a9-20b1ec5751a9-metrics-client-ca\") pod \"de33ee91-814f-4e24-92a9-20b1ec5751a9\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " Apr 17 20:48:24.323047 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.323052 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-main-tls\") pod \"de33ee91-814f-4e24-92a9-20b1ec5751a9\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " Apr 17 20:48:24.323233 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.323068 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-config-volume\") pod \"de33ee91-814f-4e24-92a9-20b1ec5751a9\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " Apr 17 20:48:24.323233 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.323101 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de33ee91-814f-4e24-92a9-20b1ec5751a9-tls-assets\") pod \"de33ee91-814f-4e24-92a9-20b1ec5751a9\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " Apr 17 20:48:24.323233 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.323125 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-web-config\") pod \"de33ee91-814f-4e24-92a9-20b1ec5751a9\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " Apr 17 20:48:24.323233 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:48:24.323141 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy-web\") pod \"de33ee91-814f-4e24-92a9-20b1ec5751a9\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " Apr 17 20:48:24.323233 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.323184 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"de33ee91-814f-4e24-92a9-20b1ec5751a9\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " Apr 17 20:48:24.323484 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.323242 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-cluster-tls-config\") pod \"de33ee91-814f-4e24-92a9-20b1ec5751a9\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " Apr 17 20:48:24.323484 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.323275 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de33ee91-814f-4e24-92a9-20b1ec5751a9-config-out\") pod \"de33ee91-814f-4e24-92a9-20b1ec5751a9\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " Apr 17 20:48:24.323484 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.323306 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nftd\" (UniqueName: \"kubernetes.io/projected/de33ee91-814f-4e24-92a9-20b1ec5751a9-kube-api-access-4nftd\") pod \"de33ee91-814f-4e24-92a9-20b1ec5751a9\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " Apr 17 20:48:24.323484 ip-10-0-137-102 kubenswrapper[2572]: 
I0417 20:48:24.323337 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de33ee91-814f-4e24-92a9-20b1ec5751a9-alertmanager-trusted-ca-bundle\") pod \"de33ee91-814f-4e24-92a9-20b1ec5751a9\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " Apr 17 20:48:24.323484 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.323369 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy\") pod \"de33ee91-814f-4e24-92a9-20b1ec5751a9\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " Apr 17 20:48:24.323484 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.323418 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/de33ee91-814f-4e24-92a9-20b1ec5751a9-alertmanager-main-db\") pod \"de33ee91-814f-4e24-92a9-20b1ec5751a9\" (UID: \"de33ee91-814f-4e24-92a9-20b1ec5751a9\") " Apr 17 20:48:24.323484 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.323410 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de33ee91-814f-4e24-92a9-20b1ec5751a9-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "de33ee91-814f-4e24-92a9-20b1ec5751a9" (UID: "de33ee91-814f-4e24-92a9-20b1ec5751a9"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:48:24.323841 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.323647 2572 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de33ee91-814f-4e24-92a9-20b1ec5751a9-metrics-client-ca\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:24.324100 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.323953 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de33ee91-814f-4e24-92a9-20b1ec5751a9-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "de33ee91-814f-4e24-92a9-20b1ec5751a9" (UID: "de33ee91-814f-4e24-92a9-20b1ec5751a9"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:48:24.324284 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.324254 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de33ee91-814f-4e24-92a9-20b1ec5751a9-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "de33ee91-814f-4e24-92a9-20b1ec5751a9" (UID: "de33ee91-814f-4e24-92a9-20b1ec5751a9"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:48:24.325818 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.325775 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "de33ee91-814f-4e24-92a9-20b1ec5751a9" (UID: "de33ee91-814f-4e24-92a9-20b1ec5751a9"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:48:24.326217 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.326177 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "de33ee91-814f-4e24-92a9-20b1ec5751a9" (UID: "de33ee91-814f-4e24-92a9-20b1ec5751a9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:48:24.326436 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.326404 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-config-volume" (OuterVolumeSpecName: "config-volume") pod "de33ee91-814f-4e24-92a9-20b1ec5751a9" (UID: "de33ee91-814f-4e24-92a9-20b1ec5751a9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:48:24.326732 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.326695 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "de33ee91-814f-4e24-92a9-20b1ec5751a9" (UID: "de33ee91-814f-4e24-92a9-20b1ec5751a9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:48:24.326876 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.326850 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "de33ee91-814f-4e24-92a9-20b1ec5751a9" (UID: "de33ee91-814f-4e24-92a9-20b1ec5751a9"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:48:24.326953 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.326890 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de33ee91-814f-4e24-92a9-20b1ec5751a9-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "de33ee91-814f-4e24-92a9-20b1ec5751a9" (UID: "de33ee91-814f-4e24-92a9-20b1ec5751a9"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:48:24.327098 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.327073 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de33ee91-814f-4e24-92a9-20b1ec5751a9-config-out" (OuterVolumeSpecName: "config-out") pod "de33ee91-814f-4e24-92a9-20b1ec5751a9" (UID: "de33ee91-814f-4e24-92a9-20b1ec5751a9"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:48:24.327620 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.327600 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de33ee91-814f-4e24-92a9-20b1ec5751a9-kube-api-access-4nftd" (OuterVolumeSpecName: "kube-api-access-4nftd") pod "de33ee91-814f-4e24-92a9-20b1ec5751a9" (UID: "de33ee91-814f-4e24-92a9-20b1ec5751a9"). InnerVolumeSpecName "kube-api-access-4nftd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:48:24.330221 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.330199 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "de33ee91-814f-4e24-92a9-20b1ec5751a9" (UID: "de33ee91-814f-4e24-92a9-20b1ec5751a9"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:48:24.337777 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.337757 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-web-config" (OuterVolumeSpecName: "web-config") pod "de33ee91-814f-4e24-92a9-20b1ec5751a9" (UID: "de33ee91-814f-4e24-92a9-20b1ec5751a9"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:48:24.424285 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.424238 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-main-tls\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:24.424285 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.424256 2572 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-config-volume\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:24.424285 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.424265 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de33ee91-814f-4e24-92a9-20b1ec5751a9-tls-assets\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:24.424285 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.424274 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-web-config\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:24.424285 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.424283 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:24.424479 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.424294 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:24.424479 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.424304 2572 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-cluster-tls-config\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:24.424479 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.424314 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de33ee91-814f-4e24-92a9-20b1ec5751a9-config-out\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:24.424479 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.424323 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4nftd\" (UniqueName: \"kubernetes.io/projected/de33ee91-814f-4e24-92a9-20b1ec5751a9-kube-api-access-4nftd\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:24.424479 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.424331 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de33ee91-814f-4e24-92a9-20b1ec5751a9-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:24.424479 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.424341 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/de33ee91-814f-4e24-92a9-20b1ec5751a9-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:24.424479 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.424350 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/de33ee91-814f-4e24-92a9-20b1ec5751a9-alertmanager-main-db\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:48:24.483091 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.483069 2572 generic.go:358] "Generic (PLEG): container finished" podID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerID="30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1" exitCode=0 Apr 17 20:48:24.483091 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.483088 2572 generic.go:358] "Generic (PLEG): container finished" podID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerID="c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08" exitCode=0 Apr 17 20:48:24.483216 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.483171 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:48:24.483273 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.483163 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerDied","Data":"30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1"} Apr 17 20:48:24.483306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.483287 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerDied","Data":"c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08"} Apr 17 20:48:24.483306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.483299 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de33ee91-814f-4e24-92a9-20b1ec5751a9","Type":"ContainerDied","Data":"1f4c13a2a13b8b863ba04cd7c9dfdc0363b153734a914ec59cdcbfe9750f3df3"} Apr 17 20:48:24.483369 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.483314 2572 scope.go:117] "RemoveContainer" containerID="f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16" Apr 17 20:48:24.490547 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.490457 2572 scope.go:117] "RemoveContainer" containerID="30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1" Apr 17 20:48:24.497168 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.497151 2572 scope.go:117] "RemoveContainer" containerID="de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416" Apr 17 20:48:24.504321 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.504302 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:48:24.506462 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.506443 2572 scope.go:117] "RemoveContainer" 
containerID="c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08" Apr 17 20:48:24.515175 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.515153 2572 scope.go:117] "RemoveContainer" containerID="0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f" Apr 17 20:48:24.517383 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.517364 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:48:24.521870 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.521844 2572 scope.go:117] "RemoveContainer" containerID="d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807" Apr 17 20:48:24.528730 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.528714 2572 scope.go:117] "RemoveContainer" containerID="af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65" Apr 17 20:48:24.536097 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536040 2572 scope.go:117] "RemoveContainer" containerID="f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16" Apr 17 20:48:24.536181 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536140 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:48:24.536361 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:48:24.536343 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16\": container with ID starting with f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16 not found: ID does not exist" containerID="f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16" Apr 17 20:48:24.536397 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536368 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16"} err="failed to get 
container status \"f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16\": rpc error: code = NotFound desc = could not find container \"f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16\": container with ID starting with f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16 not found: ID does not exist" Apr 17 20:48:24.536397 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536385 2572 scope.go:117] "RemoveContainer" containerID="30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1" Apr 17 20:48:24.536479 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536462 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="alertmanager" Apr 17 20:48:24.536524 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536477 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="alertmanager" Apr 17 20:48:24.536524 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536488 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="prom-label-proxy" Apr 17 20:48:24.536524 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536497 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="prom-label-proxy" Apr 17 20:48:24.536524 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536513 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="init-config-reloader" Apr 17 20:48:24.536524 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536521 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="init-config-reloader" Apr 17 20:48:24.536684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536528 2572 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="kube-rbac-proxy" Apr 17 20:48:24.536684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536533 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="kube-rbac-proxy" Apr 17 20:48:24.536684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536539 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="kube-rbac-proxy-metric" Apr 17 20:48:24.536684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536544 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="kube-rbac-proxy-metric" Apr 17 20:48:24.536684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536556 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="kube-rbac-proxy-web" Apr 17 20:48:24.536684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536564 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="kube-rbac-proxy-web" Apr 17 20:48:24.536684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536576 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56449056-1002-4874-a19c-270abba8a1a7" containerName="console" Apr 17 20:48:24.536684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536585 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="56449056-1002-4874-a19c-270abba8a1a7" containerName="console" Apr 17 20:48:24.536684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536596 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="config-reloader" Apr 17 20:48:24.536684 ip-10-0-137-102 
kubenswrapper[2572]: E0417 20:48:24.536594 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1\": container with ID starting with 30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1 not found: ID does not exist" containerID="30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1" Apr 17 20:48:24.536684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536623 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1"} err="failed to get container status \"30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1\": rpc error: code = NotFound desc = could not find container \"30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1\": container with ID starting with 30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1 not found: ID does not exist" Apr 17 20:48:24.536684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536638 2572 scope.go:117] "RemoveContainer" containerID="de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416" Apr 17 20:48:24.536684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536603 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="config-reloader" Apr 17 20:48:24.537183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536738 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="config-reloader" Apr 17 20:48:24.537183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536755 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="kube-rbac-proxy-metric" Apr 17 20:48:24.537183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536766 
2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="kube-rbac-proxy-web" Apr 17 20:48:24.537183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536774 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="56449056-1002-4874-a19c-270abba8a1a7" containerName="console" Apr 17 20:48:24.537183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536784 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="alertmanager" Apr 17 20:48:24.537183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536794 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="kube-rbac-proxy" Apr 17 20:48:24.537183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536823 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" containerName="prom-label-proxy" Apr 17 20:48:24.537183 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:48:24.536875 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416\": container with ID starting with de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416 not found: ID does not exist" containerID="de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416" Apr 17 20:48:24.537183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536893 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416"} err="failed to get container status \"de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416\": rpc error: code = NotFound desc = could not find container \"de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416\": 
container with ID starting with de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416 not found: ID does not exist" Apr 17 20:48:24.537183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.536912 2572 scope.go:117] "RemoveContainer" containerID="c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08" Apr 17 20:48:24.537183 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:48:24.537110 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08\": container with ID starting with c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08 not found: ID does not exist" containerID="c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08" Apr 17 20:48:24.537183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.537127 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08"} err="failed to get container status \"c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08\": rpc error: code = NotFound desc = could not find container \"c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08\": container with ID starting with c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08 not found: ID does not exist" Apr 17 20:48:24.537183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.537144 2572 scope.go:117] "RemoveContainer" containerID="0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f" Apr 17 20:48:24.537947 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:48:24.537845 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f\": container with ID starting with 0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f not 
found: ID does not exist" containerID="0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f" Apr 17 20:48:24.537947 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.537876 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f"} err="failed to get container status \"0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f\": rpc error: code = NotFound desc = could not find container \"0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f\": container with ID starting with 0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f not found: ID does not exist" Apr 17 20:48:24.537947 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.537897 2572 scope.go:117] "RemoveContainer" containerID="d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807" Apr 17 20:48:24.539505 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:48:24.539485 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807\": container with ID starting with d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807 not found: ID does not exist" containerID="d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807" Apr 17 20:48:24.539603 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.539508 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807"} err="failed to get container status \"d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807\": rpc error: code = NotFound desc = could not find container \"d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807\": container with ID starting with d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807 not found: ID does 
not exist" Apr 17 20:48:24.539603 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.539524 2572 scope.go:117] "RemoveContainer" containerID="af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65" Apr 17 20:48:24.539826 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:48:24.539784 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65\": container with ID starting with af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65 not found: ID does not exist" containerID="af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65" Apr 17 20:48:24.539906 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.539834 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65"} err="failed to get container status \"af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65\": rpc error: code = NotFound desc = could not find container \"af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65\": container with ID starting with af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65 not found: ID does not exist" Apr 17 20:48:24.539906 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.539854 2572 scope.go:117] "RemoveContainer" containerID="f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16" Apr 17 20:48:24.540099 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.540077 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16"} err="failed to get container status \"f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16\": rpc error: code = NotFound desc = could not find container \"f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16\": container 
with ID starting with f9329399046eafab0b8f72a788974d9f072431093cdf25956b93c470a924ea16 not found: ID does not exist" Apr 17 20:48:24.540141 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.540101 2572 scope.go:117] "RemoveContainer" containerID="30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1" Apr 17 20:48:24.540287 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.540273 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1"} err="failed to get container status \"30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1\": rpc error: code = NotFound desc = could not find container \"30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1\": container with ID starting with 30dbeffb7e1901be47f9e9f94efa37eb4932907477a0d5bfa31f2471bfa0d3b1 not found: ID does not exist" Apr 17 20:48:24.540328 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.540287 2572 scope.go:117] "RemoveContainer" containerID="de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416" Apr 17 20:48:24.540475 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.540458 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416"} err="failed to get container status \"de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416\": rpc error: code = NotFound desc = could not find container \"de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416\": container with ID starting with de87e6c4ff33aab515ca0aea16c0997efccce6d5131228810c65c7bf5829c416 not found: ID does not exist" Apr 17 20:48:24.540522 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.540475 2572 scope.go:117] "RemoveContainer" containerID="c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08" Apr 17 20:48:24.540660 ip-10-0-137-102 kubenswrapper[2572]: 
I0417 20:48:24.540645 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08"} err="failed to get container status \"c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08\": rpc error: code = NotFound desc = could not find container \"c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08\": container with ID starting with c3fff7648ac65c0d81ada58422b491296606f3f96411ed8e2807fc40a4007c08 not found: ID does not exist"
Apr 17 20:48:24.540703 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.540660 2572 scope.go:117] "RemoveContainer" containerID="0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f"
Apr 17 20:48:24.540870 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.540848 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f"} err="failed to get container status \"0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f\": rpc error: code = NotFound desc = could not find container \"0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f\": container with ID starting with 0dc73486e92a01a4e0ac8f5535296208de4624ad6064b99085ae292b2767f68f not found: ID does not exist"
Apr 17 20:48:24.540870 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.540867 2572 scope.go:117] "RemoveContainer" containerID="d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807"
Apr 17 20:48:24.541085 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.541068 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807"} err="failed to get container status \"d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807\": rpc error: code = NotFound desc = could not find container \"d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807\": container with ID starting with d5d8e9f2b6bd4af61beda503fbcc5cbb4fa87e39075089872249cf771d3fc807 not found: ID does not exist"
Apr 17 20:48:24.541133 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.541086 2572 scope.go:117] "RemoveContainer" containerID="af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65"
Apr 17 20:48:24.541331 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.541312 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65"} err="failed to get container status \"af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65\": rpc error: code = NotFound desc = could not find container \"af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65\": container with ID starting with af2d7f66c5fd3d5818fb6c4da0f9e8789bfd778aff8261fb911b590b53810b65 not found: ID does not exist"
Apr 17 20:48:24.544789 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.544775 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.546823 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.546789 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 20:48:24.547073 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.547049 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 20:48:24.547160 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.547100 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 20:48:24.547160 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.547133 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 20:48:24.547268 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.547233 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 20:48:24.547535 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.547504 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 20:48:24.547717 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.547677 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 20:48:24.547717 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.547706 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 20:48:24.548105 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.548061 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-ggvr5\""
Apr 17 20:48:24.550086 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.550058 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 20:48:24.553402 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.553381 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 20:48:24.625985 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.625962 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.626082 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.626001 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/352184b8-c491-4213-8df2-6b0459566690-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.626082 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.626029 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-config-volume\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.626082 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.626046 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.626082 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.626061 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcsfs\" (UniqueName: \"kubernetes.io/projected/352184b8-c491-4213-8df2-6b0459566690-kube-api-access-zcsfs\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.626234 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.626088 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/352184b8-c491-4213-8df2-6b0459566690-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.626234 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.626158 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.626234 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.626186 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/352184b8-c491-4213-8df2-6b0459566690-tls-assets\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.626234 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.626204 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.626234 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.626220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/352184b8-c491-4213-8df2-6b0459566690-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.626446 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.626237 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-web-config\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.626446 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.626315 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.626446 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.626351 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/352184b8-c491-4213-8df2-6b0459566690-config-out\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.713188 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.713118 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de33ee91-814f-4e24-92a9-20b1ec5751a9" path="/var/lib/kubelet/pods/de33ee91-814f-4e24-92a9-20b1ec5751a9/volumes"
Apr 17 20:48:24.727467 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.727447 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.727579 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.727476 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/352184b8-c491-4213-8df2-6b0459566690-tls-assets\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.727579 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.727502 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.727579 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.727516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/352184b8-c491-4213-8df2-6b0459566690-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.727687 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.727647 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-web-config\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.727728 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.727693 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.727778 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.727724 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/352184b8-c491-4213-8df2-6b0459566690-config-out\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.727778 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.727761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.728057 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.728036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/352184b8-c491-4213-8df2-6b0459566690-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.728218 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.728196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/352184b8-c491-4213-8df2-6b0459566690-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.728293 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.728248 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-config-volume\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.728293 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.728274 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.728396 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.728301 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcsfs\" (UniqueName: \"kubernetes.io/projected/352184b8-c491-4213-8df2-6b0459566690-kube-api-access-zcsfs\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.728396 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.728354 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/352184b8-c491-4213-8df2-6b0459566690-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.729357 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.729330 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/352184b8-c491-4213-8df2-6b0459566690-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.730332 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.730306 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/352184b8-c491-4213-8df2-6b0459566690-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.730417 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.730340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/352184b8-c491-4213-8df2-6b0459566690-config-out\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.730684 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.730650 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-web-config\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.731041 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.730653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/352184b8-c491-4213-8df2-6b0459566690-tls-assets\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.731118 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.731057 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.731261 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.731240 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.731347 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.731327 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.731758 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.731736 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.732345 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.732325 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.732671 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.732653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/352184b8-c491-4213-8df2-6b0459566690-config-volume\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.737552 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.737535 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcsfs\" (UniqueName: \"kubernetes.io/projected/352184b8-c491-4213-8df2-6b0459566690-kube-api-access-zcsfs\") pod \"alertmanager-main-0\" (UID: \"352184b8-c491-4213-8df2-6b0459566690\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.855491 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.855463 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:48:24.975512 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:24.975486 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 20:48:24.977500 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:48:24.977473 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod352184b8_c491_4213_8df2_6b0459566690.slice/crio-87fc3f6104bed197831e819b545b3859a6ab7dc7811b787ecbedaa7ce1df02ba WatchSource:0}: Error finding container 87fc3f6104bed197831e819b545b3859a6ab7dc7811b787ecbedaa7ce1df02ba: Status 404 returned error can't find the container with id 87fc3f6104bed197831e819b545b3859a6ab7dc7811b787ecbedaa7ce1df02ba
Apr 17 20:48:25.487478 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:25.487446 2572 generic.go:358] "Generic (PLEG): container finished" podID="352184b8-c491-4213-8df2-6b0459566690" containerID="e7af5ed858f99a1c234b8f153121fc1ca2a7e2d8539110a2331b0dbb6e2cd015" exitCode=0
Apr 17 20:48:25.487917 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:25.487535 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"352184b8-c491-4213-8df2-6b0459566690","Type":"ContainerDied","Data":"e7af5ed858f99a1c234b8f153121fc1ca2a7e2d8539110a2331b0dbb6e2cd015"}
Apr 17 20:48:25.487917 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:25.487577 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"352184b8-c491-4213-8df2-6b0459566690","Type":"ContainerStarted","Data":"87fc3f6104bed197831e819b545b3859a6ab7dc7811b787ecbedaa7ce1df02ba"}
Apr 17 20:48:26.495817 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.495777 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"352184b8-c491-4213-8df2-6b0459566690","Type":"ContainerStarted","Data":"f3bade7e4e94c87377dac811b52b05513eb7e31b2202d4f4e830d99cdbea6219"}
Apr 17 20:48:26.496242 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.495839 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"352184b8-c491-4213-8df2-6b0459566690","Type":"ContainerStarted","Data":"e5d21adb5b4d2d095ca8711702795f73ddef09c26154283aa881ad444db4868f"}
Apr 17 20:48:26.496242 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.495853 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"352184b8-c491-4213-8df2-6b0459566690","Type":"ContainerStarted","Data":"c3cd60cf64e60f8b845d9f215c5cec43f2a472e5065d8646117a45f1ffa1025d"}
Apr 17 20:48:26.496242 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.495864 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"352184b8-c491-4213-8df2-6b0459566690","Type":"ContainerStarted","Data":"86533a34eb4a65d925c8547e8fb464bf5bd1c62abc70ff80ea7565c947a0da41"}
Apr 17 20:48:26.496242 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.495877 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"352184b8-c491-4213-8df2-6b0459566690","Type":"ContainerStarted","Data":"c4d75747a761f1ceebb467d73faaefe040411468c83dae6828a99b5279821f27"}
Apr 17 20:48:26.496242 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.495888 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"352184b8-c491-4213-8df2-6b0459566690","Type":"ContainerStarted","Data":"bb91aed3bff26770eadc2b043e996f509a81b0708565609e017c35108cfa3881"}
Apr 17 20:48:26.527246 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.527217 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-f4f46dddc-srlmd"]
Apr 17 20:48:26.530969 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.530942 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.533450 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.533424 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 17 20:48:26.533622 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.533606 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 17 20:48:26.533950 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.533926 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 17 20:48:26.534055 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.533963 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 17 20:48:26.534128 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.534067 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 17 20:48:26.534128 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.534115 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-znn98\""
Apr 17 20:48:26.542630 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.542597 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2031370c-07c1-4314-93b9-89184c8ab731-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.542726 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.542676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2031370c-07c1-4314-93b9-89184c8ab731-telemeter-client-tls\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.542726 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.542704 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2031370c-07c1-4314-93b9-89184c8ab731-federate-client-tls\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.542837 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.542733 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2031370c-07c1-4314-93b9-89184c8ab731-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.542837 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.542823 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2031370c-07c1-4314-93b9-89184c8ab731-secret-telemeter-client\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.543025 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.543005 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2031370c-07c1-4314-93b9-89184c8ab731-metrics-client-ca\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.543088 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.543044 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2031370c-07c1-4314-93b9-89184c8ab731-serving-certs-ca-bundle\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.543088 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.543072 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmr68\" (UniqueName: \"kubernetes.io/projected/2031370c-07c1-4314-93b9-89184c8ab731-kube-api-access-vmr68\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.548720 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.548663 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.54864284 podStartE2EDuration="2.54864284s" podCreationTimestamp="2026-04-17 20:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:48:26.545442045 +0000 UTC m=+258.270709403" watchObservedRunningTime="2026-04-17 20:48:26.54864284 +0000 UTC m=+258.273910199"
Apr 17 20:48:26.550122 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.550100 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 17 20:48:26.551571 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.551552 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-f4f46dddc-srlmd"]
Apr 17 20:48:26.643615 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.643578 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2031370c-07c1-4314-93b9-89184c8ab731-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.643833 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.643628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2031370c-07c1-4314-93b9-89184c8ab731-telemeter-client-tls\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.643833 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.643650 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2031370c-07c1-4314-93b9-89184c8ab731-federate-client-tls\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.643833 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.643668 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2031370c-07c1-4314-93b9-89184c8ab731-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.643833 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.643700 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2031370c-07c1-4314-93b9-89184c8ab731-secret-telemeter-client\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.643833 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.643765 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2031370c-07c1-4314-93b9-89184c8ab731-metrics-client-ca\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.643833 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.643794 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2031370c-07c1-4314-93b9-89184c8ab731-serving-certs-ca-bundle\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.644149 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.643841 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmr68\" (UniqueName: \"kubernetes.io/projected/2031370c-07c1-4314-93b9-89184c8ab731-kube-api-access-vmr68\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.644629 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.644601 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2031370c-07c1-4314-93b9-89184c8ab731-serving-certs-ca-bundle\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.644825 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.644735 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2031370c-07c1-4314-93b9-89184c8ab731-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.645216 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.645194 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2031370c-07c1-4314-93b9-89184c8ab731-metrics-client-ca\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.647502 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.646625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2031370c-07c1-4314-93b9-89184c8ab731-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.647502 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.647102 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2031370c-07c1-4314-93b9-89184c8ab731-federate-client-tls\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.647502 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.647108 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2031370c-07c1-4314-93b9-89184c8ab731-telemeter-client-tls\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.647502 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.647454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2031370c-07c1-4314-93b9-89184c8ab731-secret-telemeter-client\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.650766 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.650743 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmr68\" (UniqueName: \"kubernetes.io/projected/2031370c-07c1-4314-93b9-89184c8ab731-kube-api-access-vmr68\") pod \"telemeter-client-f4f46dddc-srlmd\" (UID: \"2031370c-07c1-4314-93b9-89184c8ab731\") " pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd"
Apr 17 20:48:26.842538 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.842500 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd" Apr 17 20:48:26.975485 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:26.975423 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-f4f46dddc-srlmd"] Apr 17 20:48:26.980349 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:48:26.980319 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2031370c_07c1_4314_93b9_89184c8ab731.slice/crio-a8a3612eb6f2944bd533b9c5087c9454ba67a17605322af730220dbc700eefa8 WatchSource:0}: Error finding container a8a3612eb6f2944bd533b9c5087c9454ba67a17605322af730220dbc700eefa8: Status 404 returned error can't find the container with id a8a3612eb6f2944bd533b9c5087c9454ba67a17605322af730220dbc700eefa8 Apr 17 20:48:27.499990 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:27.499953 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd" event={"ID":"2031370c-07c1-4314-93b9-89184c8ab731","Type":"ContainerStarted","Data":"a8a3612eb6f2944bd533b9c5087c9454ba67a17605322af730220dbc700eefa8"} Apr 17 20:48:29.507040 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:29.507002 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd" event={"ID":"2031370c-07c1-4314-93b9-89184c8ab731","Type":"ContainerStarted","Data":"7b13c590e596f74b358c3328236dd6769ff40bdf01193367fe47af73c13dc1aa"} Apr 17 20:48:29.507408 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:29.507047 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd" event={"ID":"2031370c-07c1-4314-93b9-89184c8ab731","Type":"ContainerStarted","Data":"f9a0083100d13ec6d1c2f8e0cae010826daf827c1ddaf5590d8042daae57e94b"} Apr 17 20:48:29.507408 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:29.507061 2572 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd" event={"ID":"2031370c-07c1-4314-93b9-89184c8ab731","Type":"ContainerStarted","Data":"6c593adb02b4f8045ccea258208936e3cbf843f1111f7be8ba01749f815354f7"} Apr 17 20:48:29.526603 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:29.526559 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-f4f46dddc-srlmd" podStartSLOduration=1.5635212410000001 podStartE2EDuration="3.526544745s" podCreationTimestamp="2026-04-17 20:48:26 +0000 UTC" firstStartedPulling="2026-04-17 20:48:26.982334377 +0000 UTC m=+258.707601726" lastFinishedPulling="2026-04-17 20:48:28.945357892 +0000 UTC m=+260.670625230" observedRunningTime="2026-04-17 20:48:29.525595527 +0000 UTC m=+261.250862885" watchObservedRunningTime="2026-04-17 20:48:29.526544745 +0000 UTC m=+261.251812102" Apr 17 20:48:30.145564 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.145534 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cf7b8b58c-vp5dl"] Apr 17 20:48:30.148773 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.148751 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.157445 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.157424 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cf7b8b58c-vp5dl"] Apr 17 20:48:30.173732 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.173703 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-oauth-serving-cert\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.173871 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.173850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5df22fb1-baac-494a-928f-a431814818dc-console-serving-cert\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.173927 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.173898 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-service-ca\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.173987 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.173942 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5df22fb1-baac-494a-928f-a431814818dc-console-oauth-config\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 
20:48:30.173987 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.173969 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-console-config\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.174087 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.174062 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-trusted-ca-bundle\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.174140 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.174090 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmjfj\" (UniqueName: \"kubernetes.io/projected/5df22fb1-baac-494a-928f-a431814818dc-kube-api-access-kmjfj\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.275107 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.275075 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmjfj\" (UniqueName: \"kubernetes.io/projected/5df22fb1-baac-494a-928f-a431814818dc-kube-api-access-kmjfj\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.275265 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.275131 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-oauth-serving-cert\") 
pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.275265 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.275166 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5df22fb1-baac-494a-928f-a431814818dc-console-serving-cert\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.275265 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.275195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-service-ca\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.275265 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.275249 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5df22fb1-baac-494a-928f-a431814818dc-console-oauth-config\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.275265 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.275264 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-console-config\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.275497 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.275287 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-trusted-ca-bundle\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.275904 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.275875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-oauth-serving-cert\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.276012 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.275932 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-console-config\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.276171 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.276148 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-service-ca\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.276256 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.276240 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-trusted-ca-bundle\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.277653 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.277633 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5df22fb1-baac-494a-928f-a431814818dc-console-serving-cert\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.277745 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.277693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5df22fb1-baac-494a-928f-a431814818dc-console-oauth-config\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.281681 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.281661 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmjfj\" (UniqueName: \"kubernetes.io/projected/5df22fb1-baac-494a-928f-a431814818dc-kube-api-access-kmjfj\") pod \"console-7cf7b8b58c-vp5dl\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.459216 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.459130 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:30.587190 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:30.587160 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cf7b8b58c-vp5dl"] Apr 17 20:48:30.590395 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:48:30.590365 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5df22fb1_baac_494a_928f_a431814818dc.slice/crio-f1883716dfeeac245eb0bad1f7fa80f9d0f8e1a3b6eb39fabd2dc285e698a955 WatchSource:0}: Error finding container f1883716dfeeac245eb0bad1f7fa80f9d0f8e1a3b6eb39fabd2dc285e698a955: Status 404 returned error can't find the container with id f1883716dfeeac245eb0bad1f7fa80f9d0f8e1a3b6eb39fabd2dc285e698a955 Apr 17 20:48:31.516948 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:31.516908 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf7b8b58c-vp5dl" event={"ID":"5df22fb1-baac-494a-928f-a431814818dc","Type":"ContainerStarted","Data":"6a050bdaaa02dabc228739d41fcb6cbae3ee1b292e3ea613cca056251331dbf2"} Apr 17 20:48:31.516948 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:31.516948 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf7b8b58c-vp5dl" event={"ID":"5df22fb1-baac-494a-928f-a431814818dc","Type":"ContainerStarted","Data":"f1883716dfeeac245eb0bad1f7fa80f9d0f8e1a3b6eb39fabd2dc285e698a955"} Apr 17 20:48:31.533168 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:31.533125 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cf7b8b58c-vp5dl" podStartSLOduration=1.5331107510000002 podStartE2EDuration="1.533110751s" podCreationTimestamp="2026-04-17 20:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:48:31.531563773 +0000 UTC 
m=+263.256831129" watchObservedRunningTime="2026-04-17 20:48:31.533110751 +0000 UTC m=+263.258378107" Apr 17 20:48:40.459873 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:40.459836 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:40.460243 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:40.459932 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:40.464567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:40.464547 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:40.547471 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:40.547445 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:48:40.593858 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:48:40.588247 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85ff8d8c7d-vhgzp"] Apr 17 20:49:05.614578 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.614517 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-85ff8d8c7d-vhgzp" podUID="1a20d1f3-710a-487c-82a5-dca6d37f57c1" containerName="console" containerID="cri-o://5d7dda3b0d011db8f8b5ef8bd926e3c7bdc177368b82154d1fc2b2b75157db68" gracePeriod=15 Apr 17 20:49:05.851624 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.851602 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85ff8d8c7d-vhgzp_1a20d1f3-710a-487c-82a5-dca6d37f57c1/console/0.log" Apr 17 20:49:05.851749 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.851670 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:49:05.963361 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.963268 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-serving-cert\") pod \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " Apr 17 20:49:05.963361 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.963340 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-oauth-serving-cert\") pod \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " Apr 17 20:49:05.963361 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.963357 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-service-ca\") pod \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " Apr 17 20:49:05.963614 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.963396 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-oauth-config\") pod \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " Apr 17 20:49:05.963614 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.963415 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdvvm\" (UniqueName: \"kubernetes.io/projected/1a20d1f3-710a-487c-82a5-dca6d37f57c1-kube-api-access-rdvvm\") pod \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " Apr 17 20:49:05.963614 
ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.963440 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-trusted-ca-bundle\") pod \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " Apr 17 20:49:05.963614 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.963481 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-config\") pod \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\" (UID: \"1a20d1f3-710a-487c-82a5-dca6d37f57c1\") " Apr 17 20:49:05.963952 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.963918 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1a20d1f3-710a-487c-82a5-dca6d37f57c1" (UID: "1a20d1f3-710a-487c-82a5-dca6d37f57c1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:49:05.963952 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.963847 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-service-ca" (OuterVolumeSpecName: "service-ca") pod "1a20d1f3-710a-487c-82a5-dca6d37f57c1" (UID: "1a20d1f3-710a-487c-82a5-dca6d37f57c1"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:49:05.964063 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.964029 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-config" (OuterVolumeSpecName: "console-config") pod "1a20d1f3-710a-487c-82a5-dca6d37f57c1" (UID: "1a20d1f3-710a-487c-82a5-dca6d37f57c1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:49:05.964063 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.964023 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1a20d1f3-710a-487c-82a5-dca6d37f57c1" (UID: "1a20d1f3-710a-487c-82a5-dca6d37f57c1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:49:05.965458 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.965433 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1a20d1f3-710a-487c-82a5-dca6d37f57c1" (UID: "1a20d1f3-710a-487c-82a5-dca6d37f57c1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:49:05.965868 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.965841 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1a20d1f3-710a-487c-82a5-dca6d37f57c1" (UID: "1a20d1f3-710a-487c-82a5-dca6d37f57c1"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:49:05.965943 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:05.965916 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a20d1f3-710a-487c-82a5-dca6d37f57c1-kube-api-access-rdvvm" (OuterVolumeSpecName: "kube-api-access-rdvvm") pod "1a20d1f3-710a-487c-82a5-dca6d37f57c1" (UID: "1a20d1f3-710a-487c-82a5-dca6d37f57c1"). InnerVolumeSpecName "kube-api-access-rdvvm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:49:06.064486 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.064454 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-oauth-config\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:49:06.064486 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.064480 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rdvvm\" (UniqueName: \"kubernetes.io/projected/1a20d1f3-710a-487c-82a5-dca6d37f57c1-kube-api-access-rdvvm\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:49:06.064486 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.064492 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-trusted-ca-bundle\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:49:06.064689 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.064501 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-config\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:49:06.064689 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.064511 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1a20d1f3-710a-487c-82a5-dca6d37f57c1-console-serving-cert\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:49:06.064689 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.064519 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-oauth-serving-cert\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:49:06.064689 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.064528 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1a20d1f3-710a-487c-82a5-dca6d37f57c1-service-ca\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:49:06.618360 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.618334 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85ff8d8c7d-vhgzp_1a20d1f3-710a-487c-82a5-dca6d37f57c1/console/0.log" Apr 17 20:49:06.618766 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.618370 2572 generic.go:358] "Generic (PLEG): container finished" podID="1a20d1f3-710a-487c-82a5-dca6d37f57c1" containerID="5d7dda3b0d011db8f8b5ef8bd926e3c7bdc177368b82154d1fc2b2b75157db68" exitCode=2 Apr 17 20:49:06.618766 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.618433 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85ff8d8c7d-vhgzp" Apr 17 20:49:06.618766 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.618433 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85ff8d8c7d-vhgzp" event={"ID":"1a20d1f3-710a-487c-82a5-dca6d37f57c1","Type":"ContainerDied","Data":"5d7dda3b0d011db8f8b5ef8bd926e3c7bdc177368b82154d1fc2b2b75157db68"} Apr 17 20:49:06.618766 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.618541 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85ff8d8c7d-vhgzp" event={"ID":"1a20d1f3-710a-487c-82a5-dca6d37f57c1","Type":"ContainerDied","Data":"8cb8780fa8b870ec48bc7a1d9e62c9f7d3a0b457390119866ce5683b39ef1be5"} Apr 17 20:49:06.618766 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.618561 2572 scope.go:117] "RemoveContainer" containerID="5d7dda3b0d011db8f8b5ef8bd926e3c7bdc177368b82154d1fc2b2b75157db68" Apr 17 20:49:06.626885 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.626872 2572 scope.go:117] "RemoveContainer" containerID="5d7dda3b0d011db8f8b5ef8bd926e3c7bdc177368b82154d1fc2b2b75157db68" Apr 17 20:49:06.627119 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:49:06.627102 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d7dda3b0d011db8f8b5ef8bd926e3c7bdc177368b82154d1fc2b2b75157db68\": container with ID starting with 5d7dda3b0d011db8f8b5ef8bd926e3c7bdc177368b82154d1fc2b2b75157db68 not found: ID does not exist" containerID="5d7dda3b0d011db8f8b5ef8bd926e3c7bdc177368b82154d1fc2b2b75157db68" Apr 17 20:49:06.627164 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.627126 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7dda3b0d011db8f8b5ef8bd926e3c7bdc177368b82154d1fc2b2b75157db68"} err="failed to get container status \"5d7dda3b0d011db8f8b5ef8bd926e3c7bdc177368b82154d1fc2b2b75157db68\": rpc error: code = 
NotFound desc = could not find container \"5d7dda3b0d011db8f8b5ef8bd926e3c7bdc177368b82154d1fc2b2b75157db68\": container with ID starting with 5d7dda3b0d011db8f8b5ef8bd926e3c7bdc177368b82154d1fc2b2b75157db68 not found: ID does not exist" Apr 17 20:49:06.637451 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.637423 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85ff8d8c7d-vhgzp"] Apr 17 20:49:06.640452 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.640432 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-85ff8d8c7d-vhgzp"] Apr 17 20:49:06.712267 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:06.712237 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a20d1f3-710a-487c-82a5-dca6d37f57c1" path="/var/lib/kubelet/pods/1a20d1f3-710a-487c-82a5-dca6d37f57c1/volumes" Apr 17 20:49:08.664006 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:08.663970 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/0.log" Apr 17 20:49:08.664396 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:08.664149 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/0.log" Apr 17 20:49:08.676247 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:49:08.676153 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 20:50:07.994094 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:07.994010 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c477c97d9-dc8qb"] Apr 17 20:50:07.994495 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:07.994373 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a20d1f3-710a-487c-82a5-dca6d37f57c1" containerName="console" Apr 17 20:50:07.994495 ip-10-0-137-102 kubenswrapper[2572]: 
I0417 20:50:07.994387 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a20d1f3-710a-487c-82a5-dca6d37f57c1" containerName="console" Apr 17 20:50:07.994495 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:07.994433 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a20d1f3-710a-487c-82a5-dca6d37f57c1" containerName="console" Apr 17 20:50:07.997353 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:07.997335 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.005621 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.005599 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c477c97d9-dc8qb"] Apr 17 20:50:08.148148 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.148110 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r828h\" (UniqueName: \"kubernetes.io/projected/cd91522a-fa69-4614-8392-367a32d29a4d-kube-api-access-r828h\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.148148 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.148151 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd91522a-fa69-4614-8392-367a32d29a4d-console-serving-cert\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.148376 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.148181 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-trusted-ca-bundle\") pod \"console-7c477c97d9-dc8qb\" (UID: 
\"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.148376 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.148261 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd91522a-fa69-4614-8392-367a32d29a4d-console-oauth-config\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.148376 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.148284 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-service-ca\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.148376 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.148300 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-oauth-serving-cert\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.148376 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.148324 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-console-config\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.249580 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.249495 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/cd91522a-fa69-4614-8392-367a32d29a4d-console-oauth-config\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.249580 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.249532 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-service-ca\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.249580 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.249550 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-oauth-serving-cert\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.249580 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.249573 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-console-config\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.249934 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.249598 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r828h\" (UniqueName: \"kubernetes.io/projected/cd91522a-fa69-4614-8392-367a32d29a4d-kube-api-access-r828h\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.249934 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.249620 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd91522a-fa69-4614-8392-367a32d29a4d-console-serving-cert\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.249934 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.249644 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-trusted-ca-bundle\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.250387 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.250365 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-console-config\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.250476 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.250420 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-oauth-serving-cert\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.250551 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.250528 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-service-ca\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.250642 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.250626 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-trusted-ca-bundle\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.251995 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.251972 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd91522a-fa69-4614-8392-367a32d29a4d-console-oauth-config\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.252089 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.252074 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd91522a-fa69-4614-8392-367a32d29a4d-console-serving-cert\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.256887 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.256871 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r828h\" (UniqueName: \"kubernetes.io/projected/cd91522a-fa69-4614-8392-367a32d29a4d-kube-api-access-r828h\") pod \"console-7c477c97d9-dc8qb\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.307669 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.307642 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:08.422318 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.422297 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c477c97d9-dc8qb"] Apr 17 20:50:08.424365 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:50:08.424334 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd91522a_fa69_4614_8392_367a32d29a4d.slice/crio-518189afaf1ddd8ec941e46a60e7f2b307bb38d65b51193c91f7597f5cac82fd WatchSource:0}: Error finding container 518189afaf1ddd8ec941e46a60e7f2b307bb38d65b51193c91f7597f5cac82fd: Status 404 returned error can't find the container with id 518189afaf1ddd8ec941e46a60e7f2b307bb38d65b51193c91f7597f5cac82fd Apr 17 20:50:08.426157 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.426143 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:50:08.794333 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.794297 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c477c97d9-dc8qb" event={"ID":"cd91522a-fa69-4614-8392-367a32d29a4d","Type":"ContainerStarted","Data":"455663c7f69497e4b4fd0c61da90ac4e15d58f185a1b1ac209c025a55573b364"} Apr 17 20:50:08.794333 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.794338 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c477c97d9-dc8qb" event={"ID":"cd91522a-fa69-4614-8392-367a32d29a4d","Type":"ContainerStarted","Data":"518189afaf1ddd8ec941e46a60e7f2b307bb38d65b51193c91f7597f5cac82fd"} Apr 17 20:50:08.808485 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:08.808443 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c477c97d9-dc8qb" podStartSLOduration=1.808430729 podStartE2EDuration="1.808430729s" podCreationTimestamp="2026-04-17 20:50:07 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:50:08.808381291 +0000 UTC m=+360.533648648" watchObservedRunningTime="2026-04-17 20:50:08.808430729 +0000 UTC m=+360.533698085" Apr 17 20:50:18.308679 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:18.308645 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:18.308679 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:18.308689 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:18.313189 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:18.313168 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:18.826772 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:18.826745 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:50:18.870276 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:18.870233 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cf7b8b58c-vp5dl"] Apr 17 20:50:21.288749 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.288687 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj"] Apr 17 20:50:21.292240 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.292224 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" Apr 17 20:50:21.294329 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.294308 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:50:21.295006 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.294978 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:50:21.295006 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.295004 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-74lql\"" Apr 17 20:50:21.300256 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.300232 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj"] Apr 17 20:50:21.353104 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.353069 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg6k7\" (UniqueName: \"kubernetes.io/projected/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-kube-api-access-wg6k7\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj\" (UID: \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" Apr 17 20:50:21.353255 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.353119 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj\" (UID: \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" Apr 17 20:50:21.353255 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.353173 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj\" (UID: \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" Apr 17 20:50:21.454473 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.454438 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj\" (UID: \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" Apr 17 20:50:21.454473 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.454477 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj\" (UID: \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" Apr 17 20:50:21.454713 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.454539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wg6k7\" (UniqueName: \"kubernetes.io/projected/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-kube-api-access-wg6k7\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj\" (UID: \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" Apr 17 20:50:21.454930 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.454906 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj\" (UID: \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" Apr 17 20:50:21.455010 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.454917 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj\" (UID: \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" Apr 17 20:50:21.462676 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.462656 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg6k7\" (UniqueName: \"kubernetes.io/projected/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-kube-api-access-wg6k7\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj\" (UID: \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" Apr 17 20:50:21.601837 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.601709 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" Apr 17 20:50:21.715730 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.715699 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj"] Apr 17 20:50:21.718915 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:50:21.718888 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a705dcd_bf5f_43c2_819a_4e3b54f93c0b.slice/crio-fbd4845c7a556288e2c7c4fb70b15fa542bb17bde73b8431941b171bec29964a WatchSource:0}: Error finding container fbd4845c7a556288e2c7c4fb70b15fa542bb17bde73b8431941b171bec29964a: Status 404 returned error can't find the container with id fbd4845c7a556288e2c7c4fb70b15fa542bb17bde73b8431941b171bec29964a Apr 17 20:50:21.832275 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:21.832242 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" event={"ID":"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b","Type":"ContainerStarted","Data":"fbd4845c7a556288e2c7c4fb70b15fa542bb17bde73b8431941b171bec29964a"} Apr 17 20:50:28.853003 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:28.852967 2572 generic.go:358] "Generic (PLEG): container finished" podID="1a705dcd-bf5f-43c2-819a-4e3b54f93c0b" containerID="3b071d4c4f195a21cc7096f18459a1afdddeced4bc38cc7b7bb034a680f8c6e5" exitCode=0 Apr 17 20:50:28.853365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:28.853036 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" event={"ID":"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b","Type":"ContainerDied","Data":"3b071d4c4f195a21cc7096f18459a1afdddeced4bc38cc7b7bb034a680f8c6e5"} Apr 17 20:50:35.875415 ip-10-0-137-102 kubenswrapper[2572]: 
I0417 20:50:35.875385 2572 generic.go:358] "Generic (PLEG): container finished" podID="1a705dcd-bf5f-43c2-819a-4e3b54f93c0b" containerID="7f50b92b6e81ddcca74745ba215fb1162a2b66cd12649c74406c87616b3edea5" exitCode=0 Apr 17 20:50:35.875757 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:35.875430 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" event={"ID":"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b","Type":"ContainerDied","Data":"7f50b92b6e81ddcca74745ba215fb1162a2b66cd12649c74406c87616b3edea5"} Apr 17 20:50:43.892379 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:43.892319 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7cf7b8b58c-vp5dl" podUID="5df22fb1-baac-494a-928f-a431814818dc" containerName="console" containerID="cri-o://6a050bdaaa02dabc228739d41fcb6cbae3ee1b292e3ea613cca056251331dbf2" gracePeriod=15 Apr 17 20:50:43.903366 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:43.903341 2572 generic.go:358] "Generic (PLEG): container finished" podID="1a705dcd-bf5f-43c2-819a-4e3b54f93c0b" containerID="639b8d5e75dc9082c80bb90cb451a0677cfe348def840070ae086e80b493c90f" exitCode=0 Apr 17 20:50:43.903453 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:43.903419 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" event={"ID":"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b","Type":"ContainerDied","Data":"639b8d5e75dc9082c80bb90cb451a0677cfe348def840070ae086e80b493c90f"} Apr 17 20:50:44.124406 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.124377 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cf7b8b58c-vp5dl_5df22fb1-baac-494a-928f-a431814818dc/console/0.log" Apr 17 20:50:44.124524 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.124435 2572 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-7cf7b8b58c-vp5dl" Apr 17 20:50:44.253046 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.252957 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmjfj\" (UniqueName: \"kubernetes.io/projected/5df22fb1-baac-494a-928f-a431814818dc-kube-api-access-kmjfj\") pod \"5df22fb1-baac-494a-928f-a431814818dc\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " Apr 17 20:50:44.253046 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.252997 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5df22fb1-baac-494a-928f-a431814818dc-console-oauth-config\") pod \"5df22fb1-baac-494a-928f-a431814818dc\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " Apr 17 20:50:44.253046 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.253046 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5df22fb1-baac-494a-928f-a431814818dc-console-serving-cert\") pod \"5df22fb1-baac-494a-928f-a431814818dc\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " Apr 17 20:50:44.253273 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.253171 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-service-ca\") pod \"5df22fb1-baac-494a-928f-a431814818dc\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " Apr 17 20:50:44.253273 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.253217 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-oauth-serving-cert\") pod \"5df22fb1-baac-494a-928f-a431814818dc\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " Apr 17 
20:50:44.253273 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.253242 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-console-config\") pod \"5df22fb1-baac-494a-928f-a431814818dc\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " Apr 17 20:50:44.253273 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.253267 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-trusted-ca-bundle\") pod \"5df22fb1-baac-494a-928f-a431814818dc\" (UID: \"5df22fb1-baac-494a-928f-a431814818dc\") " Apr 17 20:50:44.253614 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.253590 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-service-ca" (OuterVolumeSpecName: "service-ca") pod "5df22fb1-baac-494a-928f-a431814818dc" (UID: "5df22fb1-baac-494a-928f-a431814818dc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:50:44.253679 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.253607 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5df22fb1-baac-494a-928f-a431814818dc" (UID: "5df22fb1-baac-494a-928f-a431814818dc"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:50:44.253679 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.253617 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-console-config" (OuterVolumeSpecName: "console-config") pod "5df22fb1-baac-494a-928f-a431814818dc" (UID: "5df22fb1-baac-494a-928f-a431814818dc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:50:44.253924 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.253901 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5df22fb1-baac-494a-928f-a431814818dc" (UID: "5df22fb1-baac-494a-928f-a431814818dc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:50:44.255183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.255154 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df22fb1-baac-494a-928f-a431814818dc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5df22fb1-baac-494a-928f-a431814818dc" (UID: "5df22fb1-baac-494a-928f-a431814818dc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:50:44.255265 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.255186 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df22fb1-baac-494a-928f-a431814818dc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5df22fb1-baac-494a-928f-a431814818dc" (UID: "5df22fb1-baac-494a-928f-a431814818dc"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:50:44.255265 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.255209 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df22fb1-baac-494a-928f-a431814818dc-kube-api-access-kmjfj" (OuterVolumeSpecName: "kube-api-access-kmjfj") pod "5df22fb1-baac-494a-928f-a431814818dc" (UID: "5df22fb1-baac-494a-928f-a431814818dc"). InnerVolumeSpecName "kube-api-access-kmjfj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:50:44.354883 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.354849 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kmjfj\" (UniqueName: \"kubernetes.io/projected/5df22fb1-baac-494a-928f-a431814818dc-kube-api-access-kmjfj\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:50:44.354883 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.354878 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5df22fb1-baac-494a-928f-a431814818dc-console-oauth-config\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:50:44.354883 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.354888 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5df22fb1-baac-494a-928f-a431814818dc-console-serving-cert\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:50:44.355094 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.354897 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-service-ca\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:50:44.355094 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.354906 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-oauth-serving-cert\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:50:44.355094 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.354916 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-console-config\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:50:44.355094 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.354925 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5df22fb1-baac-494a-928f-a431814818dc-trusted-ca-bundle\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:50:44.907386 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.907362 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cf7b8b58c-vp5dl_5df22fb1-baac-494a-928f-a431814818dc/console/0.log"
Apr 17 20:50:44.907860 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.907397 2572 generic.go:358] "Generic (PLEG): container finished" podID="5df22fb1-baac-494a-928f-a431814818dc" containerID="6a050bdaaa02dabc228739d41fcb6cbae3ee1b292e3ea613cca056251331dbf2" exitCode=2
Apr 17 20:50:44.907860 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.907478 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cf7b8b58c-vp5dl"
Apr 17 20:50:44.907860 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.907488 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf7b8b58c-vp5dl" event={"ID":"5df22fb1-baac-494a-928f-a431814818dc","Type":"ContainerDied","Data":"6a050bdaaa02dabc228739d41fcb6cbae3ee1b292e3ea613cca056251331dbf2"}
Apr 17 20:50:44.907860 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.907525 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf7b8b58c-vp5dl" event={"ID":"5df22fb1-baac-494a-928f-a431814818dc","Type":"ContainerDied","Data":"f1883716dfeeac245eb0bad1f7fa80f9d0f8e1a3b6eb39fabd2dc285e698a955"}
Apr 17 20:50:44.907860 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.907539 2572 scope.go:117] "RemoveContainer" containerID="6a050bdaaa02dabc228739d41fcb6cbae3ee1b292e3ea613cca056251331dbf2"
Apr 17 20:50:44.915502 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.915485 2572 scope.go:117] "RemoveContainer" containerID="6a050bdaaa02dabc228739d41fcb6cbae3ee1b292e3ea613cca056251331dbf2"
Apr 17 20:50:44.915768 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:50:44.915749 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a050bdaaa02dabc228739d41fcb6cbae3ee1b292e3ea613cca056251331dbf2\": container with ID starting with 6a050bdaaa02dabc228739d41fcb6cbae3ee1b292e3ea613cca056251331dbf2 not found: ID does not exist" containerID="6a050bdaaa02dabc228739d41fcb6cbae3ee1b292e3ea613cca056251331dbf2"
Apr 17 20:50:44.915916 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.915777 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a050bdaaa02dabc228739d41fcb6cbae3ee1b292e3ea613cca056251331dbf2"} err="failed to get container status \"6a050bdaaa02dabc228739d41fcb6cbae3ee1b292e3ea613cca056251331dbf2\": rpc error: code = NotFound desc = could not find container \"6a050bdaaa02dabc228739d41fcb6cbae3ee1b292e3ea613cca056251331dbf2\": container with ID starting with 6a050bdaaa02dabc228739d41fcb6cbae3ee1b292e3ea613cca056251331dbf2 not found: ID does not exist"
Apr 17 20:50:44.929133 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.929101 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cf7b8b58c-vp5dl"]
Apr 17 20:50:44.930242 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:44.929916 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7cf7b8b58c-vp5dl"]
Apr 17 20:50:45.026388 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:45.026369 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj"
Apr 17 20:50:45.159940 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:45.159911 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-util\") pod \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\" (UID: \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\") "
Apr 17 20:50:45.160080 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:45.160008 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-bundle\") pod \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\" (UID: \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\") "
Apr 17 20:50:45.160080 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:45.160042 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg6k7\" (UniqueName: \"kubernetes.io/projected/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-kube-api-access-wg6k7\") pod \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\" (UID: \"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b\") "
Apr 17 20:50:45.160540 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:45.160516 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-bundle" (OuterVolumeSpecName: "bundle") pod "1a705dcd-bf5f-43c2-819a-4e3b54f93c0b" (UID: "1a705dcd-bf5f-43c2-819a-4e3b54f93c0b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:50:45.162243 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:45.162211 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-kube-api-access-wg6k7" (OuterVolumeSpecName: "kube-api-access-wg6k7") pod "1a705dcd-bf5f-43c2-819a-4e3b54f93c0b" (UID: "1a705dcd-bf5f-43c2-819a-4e3b54f93c0b"). InnerVolumeSpecName "kube-api-access-wg6k7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:50:45.163951 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:45.163931 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-util" (OuterVolumeSpecName: "util") pod "1a705dcd-bf5f-43c2-819a-4e3b54f93c0b" (UID: "1a705dcd-bf5f-43c2-819a-4e3b54f93c0b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:50:45.260780 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:45.260728 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-bundle\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:50:45.260780 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:45.260775 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wg6k7\" (UniqueName: \"kubernetes.io/projected/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-kube-api-access-wg6k7\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:50:45.260780 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:45.260785 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a705dcd-bf5f-43c2-819a-4e3b54f93c0b-util\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:50:45.911735 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:45.911649 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj" event={"ID":"1a705dcd-bf5f-43c2-819a-4e3b54f93c0b","Type":"ContainerDied","Data":"fbd4845c7a556288e2c7c4fb70b15fa542bb17bde73b8431941b171bec29964a"}
Apr 17 20:50:45.911735 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:45.911669 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56ggtj"
Apr 17 20:50:45.911735 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:45.911683 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbd4845c7a556288e2c7c4fb70b15fa542bb17bde73b8431941b171bec29964a"
Apr 17 20:50:46.712531 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:46.712493 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df22fb1-baac-494a-928f-a431814818dc" path="/var/lib/kubelet/pods/5df22fb1-baac-494a-928f-a431814818dc/volumes"
Apr 17 20:50:48.593990 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.593951 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mvxgc"]
Apr 17 20:50:48.594478 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.594409 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a705dcd-bf5f-43c2-819a-4e3b54f93c0b" containerName="extract"
Apr 17 20:50:48.594478 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.594426 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a705dcd-bf5f-43c2-819a-4e3b54f93c0b" containerName="extract"
Apr 17 20:50:48.594478 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.594453 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a705dcd-bf5f-43c2-819a-4e3b54f93c0b" containerName="pull"
Apr 17 20:50:48.594478 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.594461 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a705dcd-bf5f-43c2-819a-4e3b54f93c0b" containerName="pull"
Apr 17 20:50:48.594478 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.594480 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a705dcd-bf5f-43c2-819a-4e3b54f93c0b" containerName="util"
Apr 17 20:50:48.594715 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.594488 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a705dcd-bf5f-43c2-819a-4e3b54f93c0b" containerName="util"
Apr 17 20:50:48.594715 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.594503 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5df22fb1-baac-494a-928f-a431814818dc" containerName="console"
Apr 17 20:50:48.594715 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.594512 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df22fb1-baac-494a-928f-a431814818dc" containerName="console"
Apr 17 20:50:48.594715 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.594586 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5df22fb1-baac-494a-928f-a431814818dc" containerName="console"
Apr 17 20:50:48.594715 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.594618 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a705dcd-bf5f-43c2-819a-4e3b54f93c0b" containerName="extract"
Apr 17 20:50:48.597121 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.597103 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mvxgc"
Apr 17 20:50:48.599265 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.599247 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-8c4xr\""
Apr 17 20:50:48.599348 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.599288 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:50:48.599402 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.599288 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 17 20:50:48.606167 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.606146 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mvxgc"]
Apr 17 20:50:48.690784 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.690743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f217018e-849a-46c4-b69f-cd87d2f8ce48-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mvxgc\" (UID: \"f217018e-849a-46c4-b69f-cd87d2f8ce48\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mvxgc"
Apr 17 20:50:48.690986 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.690895 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tt4b\" (UniqueName: \"kubernetes.io/projected/f217018e-849a-46c4-b69f-cd87d2f8ce48-kube-api-access-7tt4b\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mvxgc\" (UID: \"f217018e-849a-46c4-b69f-cd87d2f8ce48\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mvxgc"
Apr 17 20:50:48.791293 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.791265 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tt4b\" (UniqueName: \"kubernetes.io/projected/f217018e-849a-46c4-b69f-cd87d2f8ce48-kube-api-access-7tt4b\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mvxgc\" (UID: \"f217018e-849a-46c4-b69f-cd87d2f8ce48\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mvxgc"
Apr 17 20:50:48.791484 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.791328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f217018e-849a-46c4-b69f-cd87d2f8ce48-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mvxgc\" (UID: \"f217018e-849a-46c4-b69f-cd87d2f8ce48\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mvxgc"
Apr 17 20:50:48.791713 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.791693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f217018e-849a-46c4-b69f-cd87d2f8ce48-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mvxgc\" (UID: \"f217018e-849a-46c4-b69f-cd87d2f8ce48\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mvxgc"
Apr 17 20:50:48.798515 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.798487 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tt4b\" (UniqueName: \"kubernetes.io/projected/f217018e-849a-46c4-b69f-cd87d2f8ce48-kube-api-access-7tt4b\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mvxgc\" (UID: \"f217018e-849a-46c4-b69f-cd87d2f8ce48\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mvxgc"
Apr 17 20:50:48.907130 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:48.907054 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mvxgc"
Apr 17 20:50:49.032359 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:49.032121 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mvxgc"]
Apr 17 20:50:49.035119 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:50:49.035092 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf217018e_849a_46c4_b69f_cd87d2f8ce48.slice/crio-fe9631bc67bc629a89af2feb9abc354002603966b6e9591efbf9f642a63d3652 WatchSource:0}: Error finding container fe9631bc67bc629a89af2feb9abc354002603966b6e9591efbf9f642a63d3652: Status 404 returned error can't find the container with id fe9631bc67bc629a89af2feb9abc354002603966b6e9591efbf9f642a63d3652
Apr 17 20:50:49.925730 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:49.925689 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mvxgc" event={"ID":"f217018e-849a-46c4-b69f-cd87d2f8ce48","Type":"ContainerStarted","Data":"fe9631bc67bc629a89af2feb9abc354002603966b6e9591efbf9f642a63d3652"}
Apr 17 20:50:57.954975 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:57.954932 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mvxgc" event={"ID":"f217018e-849a-46c4-b69f-cd87d2f8ce48","Type":"ContainerStarted","Data":"5f0496504a7ecb3e58be953e43f817915b24481956fe6b97b33f38f16b8238d4"}
Apr 17 20:50:57.973140 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:57.973089 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mvxgc" podStartSLOduration=1.969358451 podStartE2EDuration="9.973076333s" podCreationTimestamp="2026-04-17 20:50:48 +0000 UTC" firstStartedPulling="2026-04-17 20:50:49.037553843 +0000 UTC m=+400.762821192" lastFinishedPulling="2026-04-17 20:50:57.041271739 +0000 UTC m=+408.766539074" observedRunningTime="2026-04-17 20:50:57.971525859 +0000 UTC m=+409.696793216" watchObservedRunningTime="2026-04-17 20:50:57.973076333 +0000 UTC m=+409.698343690"
Apr 17 20:50:59.666751 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.666717 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"]
Apr 17 20:50:59.670270 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.670252 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"
Apr 17 20:50:59.672053 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.672027 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 20:50:59.672541 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.672525 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-74lql\""
Apr 17 20:50:59.672615 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.672557 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 20:50:59.676097 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.676059 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"]
Apr 17 20:50:59.786592 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.786560 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89e95908-421c-4e11-9f31-6988796bfe8c-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc\" (UID: \"89e95908-421c-4e11-9f31-6988796bfe8c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"
Apr 17 20:50:59.786740 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.786608 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbnsf\" (UniqueName: \"kubernetes.io/projected/89e95908-421c-4e11-9f31-6988796bfe8c-kube-api-access-fbnsf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc\" (UID: \"89e95908-421c-4e11-9f31-6988796bfe8c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"
Apr 17 20:50:59.786740 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.786688 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89e95908-421c-4e11-9f31-6988796bfe8c-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc\" (UID: \"89e95908-421c-4e11-9f31-6988796bfe8c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"
Apr 17 20:50:59.887183 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.887145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89e95908-421c-4e11-9f31-6988796bfe8c-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc\" (UID: \"89e95908-421c-4e11-9f31-6988796bfe8c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"
Apr 17 20:50:59.887346 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.887201 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbnsf\" (UniqueName: \"kubernetes.io/projected/89e95908-421c-4e11-9f31-6988796bfe8c-kube-api-access-fbnsf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc\" (UID: \"89e95908-421c-4e11-9f31-6988796bfe8c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"
Apr 17 20:50:59.887346 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.887237 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89e95908-421c-4e11-9f31-6988796bfe8c-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc\" (UID: \"89e95908-421c-4e11-9f31-6988796bfe8c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"
Apr 17 20:50:59.887584 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.887561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89e95908-421c-4e11-9f31-6988796bfe8c-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc\" (UID: \"89e95908-421c-4e11-9f31-6988796bfe8c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"
Apr 17 20:50:59.887584 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.887575 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89e95908-421c-4e11-9f31-6988796bfe8c-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc\" (UID: \"89e95908-421c-4e11-9f31-6988796bfe8c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"
Apr 17 20:50:59.894231 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.894208 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbnsf\" (UniqueName: \"kubernetes.io/projected/89e95908-421c-4e11-9f31-6988796bfe8c-kube-api-access-fbnsf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc\" (UID: \"89e95908-421c-4e11-9f31-6988796bfe8c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"
Apr 17 20:50:59.979705 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:50:59.979629 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"
Apr 17 20:51:00.095537 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:00.095515 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"]
Apr 17 20:51:00.097397 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:51:00.097373 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89e95908_421c_4e11_9f31_6988796bfe8c.slice/crio-b33e5c7c56ed4d9a71b1782f12db9ab26810f2a1b3356f80886422f8d86c1203 WatchSource:0}: Error finding container b33e5c7c56ed4d9a71b1782f12db9ab26810f2a1b3356f80886422f8d86c1203: Status 404 returned error can't find the container with id b33e5c7c56ed4d9a71b1782f12db9ab26810f2a1b3356f80886422f8d86c1203
Apr 17 20:51:00.822230 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:00.822197 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-hbvm2"]
Apr 17 20:51:00.825453 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:00.825435 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-hbvm2"
Apr 17 20:51:00.827459 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:00.827440 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 17 20:51:00.834203 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:00.834177 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-74m5w\""
Apr 17 20:51:00.834444 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:00.834390 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 17 20:51:00.837189 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:00.837166 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-hbvm2"]
Apr 17 20:51:00.895754 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:00.895720 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dglcp\" (UniqueName: \"kubernetes.io/projected/d247e651-fe24-4236-895a-d0ee408ba4df-kube-api-access-dglcp\") pod \"cert-manager-webhook-597b96b99b-hbvm2\" (UID: \"d247e651-fe24-4236-895a-d0ee408ba4df\") " pod="cert-manager/cert-manager-webhook-597b96b99b-hbvm2"
Apr 17 20:51:00.895914 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:00.895823 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d247e651-fe24-4236-895a-d0ee408ba4df-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-hbvm2\" (UID: \"d247e651-fe24-4236-895a-d0ee408ba4df\") " pod="cert-manager/cert-manager-webhook-597b96b99b-hbvm2"
Apr 17 20:51:00.965452 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:00.965423 2572 generic.go:358] "Generic (PLEG): container finished" podID="89e95908-421c-4e11-9f31-6988796bfe8c" containerID="9f7aabc27a69888d680706d4cf891431a87f05f3413ca1837490af4cf0ca54f5" exitCode=0
Apr 17 20:51:00.965563 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:00.965515 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc" event={"ID":"89e95908-421c-4e11-9f31-6988796bfe8c","Type":"ContainerDied","Data":"9f7aabc27a69888d680706d4cf891431a87f05f3413ca1837490af4cf0ca54f5"}
Apr 17 20:51:00.965563 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:00.965548 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc" event={"ID":"89e95908-421c-4e11-9f31-6988796bfe8c","Type":"ContainerStarted","Data":"b33e5c7c56ed4d9a71b1782f12db9ab26810f2a1b3356f80886422f8d86c1203"}
Apr 17 20:51:00.996748 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:00.996726 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dglcp\" (UniqueName: \"kubernetes.io/projected/d247e651-fe24-4236-895a-d0ee408ba4df-kube-api-access-dglcp\") pod \"cert-manager-webhook-597b96b99b-hbvm2\" (UID: \"d247e651-fe24-4236-895a-d0ee408ba4df\") " pod="cert-manager/cert-manager-webhook-597b96b99b-hbvm2"
Apr 17 20:51:00.996870 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:00.996767 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d247e651-fe24-4236-895a-d0ee408ba4df-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-hbvm2\" (UID: \"d247e651-fe24-4236-895a-d0ee408ba4df\") " pod="cert-manager/cert-manager-webhook-597b96b99b-hbvm2"
Apr 17 20:51:01.004226 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:01.004197 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dglcp\" (UniqueName: \"kubernetes.io/projected/d247e651-fe24-4236-895a-d0ee408ba4df-kube-api-access-dglcp\") pod \"cert-manager-webhook-597b96b99b-hbvm2\" (UID: \"d247e651-fe24-4236-895a-d0ee408ba4df\") " pod="cert-manager/cert-manager-webhook-597b96b99b-hbvm2"
Apr 17 20:51:01.004311 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:01.004203 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d247e651-fe24-4236-895a-d0ee408ba4df-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-hbvm2\" (UID: \"d247e651-fe24-4236-895a-d0ee408ba4df\") " pod="cert-manager/cert-manager-webhook-597b96b99b-hbvm2"
Apr 17 20:51:01.148444 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:01.148361 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-hbvm2"
Apr 17 20:51:01.263780 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:01.263750 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-hbvm2"]
Apr 17 20:51:01.266615 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:51:01.266585 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd247e651_fe24_4236_895a_d0ee408ba4df.slice/crio-06a50c0d40d03709342d4f1b2f3376d4192083594380e475c9afed0a7a1f9820 WatchSource:0}: Error finding container 06a50c0d40d03709342d4f1b2f3376d4192083594380e475c9afed0a7a1f9820: Status 404 returned error can't find the container with id 06a50c0d40d03709342d4f1b2f3376d4192083594380e475c9afed0a7a1f9820
Apr 17 20:51:01.970079 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:01.970040 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-hbvm2" event={"ID":"d247e651-fe24-4236-895a-d0ee408ba4df","Type":"ContainerStarted","Data":"06a50c0d40d03709342d4f1b2f3376d4192083594380e475c9afed0a7a1f9820"}
Apr 17 20:51:02.683414 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:02.683374 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-9wqgg"]
Apr 17 20:51:02.694168 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:02.694137 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-9wqgg"]
Apr 17 20:51:02.694347 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:02.694270 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-9wqgg"
Apr 17 20:51:02.696451 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:02.696419 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-r5rrb\""
Apr 17 20:51:02.813946 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:02.813911 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42379685-30dd-47a3-8dfc-3e0148a7e6ee-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-9wqgg\" (UID: \"42379685-30dd-47a3-8dfc-3e0148a7e6ee\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9wqgg"
Apr 17 20:51:02.814126 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:02.813988 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z6x6\" (UniqueName: \"kubernetes.io/projected/42379685-30dd-47a3-8dfc-3e0148a7e6ee-kube-api-access-2z6x6\") pod \"cert-manager-cainjector-8966b78d4-9wqgg\" (UID: \"42379685-30dd-47a3-8dfc-3e0148a7e6ee\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9wqgg"
Apr 17 20:51:02.915500 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:02.915455 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42379685-30dd-47a3-8dfc-3e0148a7e6ee-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-9wqgg\" (UID: \"42379685-30dd-47a3-8dfc-3e0148a7e6ee\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9wqgg"
Apr 17 20:51:02.915674 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:02.915506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z6x6\" (UniqueName: \"kubernetes.io/projected/42379685-30dd-47a3-8dfc-3e0148a7e6ee-kube-api-access-2z6x6\") pod \"cert-manager-cainjector-8966b78d4-9wqgg\" (UID: \"42379685-30dd-47a3-8dfc-3e0148a7e6ee\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9wqgg"
Apr 17 20:51:02.923198 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:02.923173 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42379685-30dd-47a3-8dfc-3e0148a7e6ee-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-9wqgg\" (UID: \"42379685-30dd-47a3-8dfc-3e0148a7e6ee\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9wqgg"
Apr 17 20:51:02.923338 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:02.923247 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z6x6\" (UniqueName: \"kubernetes.io/projected/42379685-30dd-47a3-8dfc-3e0148a7e6ee-kube-api-access-2z6x6\") pod \"cert-manager-cainjector-8966b78d4-9wqgg\" (UID: \"42379685-30dd-47a3-8dfc-3e0148a7e6ee\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9wqgg"
Apr 17 20:51:03.005841 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:03.005814 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-9wqgg"
Apr 17 20:51:03.141321 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:03.141291 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-9wqgg"]
Apr 17 20:51:03.151152 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:51:03.151106 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42379685_30dd_47a3_8dfc_3e0148a7e6ee.slice/crio-a0a766d36bd9ef8d18ae9d9e0f96b35bab62f46c08d0445ca62077a90a143fe6 WatchSource:0}: Error finding container a0a766d36bd9ef8d18ae9d9e0f96b35bab62f46c08d0445ca62077a90a143fe6: Status 404 returned error can't find the container with id a0a766d36bd9ef8d18ae9d9e0f96b35bab62f46c08d0445ca62077a90a143fe6
Apr 17 20:51:03.979412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:03.979373 2572 generic.go:358] "Generic (PLEG): container finished" podID="89e95908-421c-4e11-9f31-6988796bfe8c" containerID="88fd29b094d0b378a5afdfb220a4363fee2e9d36faa6b464670df6480cd1ccab" exitCode=0
Apr 17 20:51:03.979595 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:03.979481 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc" event={"ID":"89e95908-421c-4e11-9f31-6988796bfe8c","Type":"ContainerDied","Data":"88fd29b094d0b378a5afdfb220a4363fee2e9d36faa6b464670df6480cd1ccab"}
Apr 17 20:51:03.980724 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:03.980690 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-9wqgg" event={"ID":"42379685-30dd-47a3-8dfc-3e0148a7e6ee","Type":"ContainerStarted","Data":"a0a766d36bd9ef8d18ae9d9e0f96b35bab62f46c08d0445ca62077a90a143fe6"}
Apr 17 20:51:04.986976 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:04.986930 2572 generic.go:358] "Generic (PLEG): container finished" podID="89e95908-421c-4e11-9f31-6988796bfe8c" containerID="0c0169f12fe86dcc669aaafdb91f6fba24415be2e2b2b8c806ee0db74f6d0a77" exitCode=0
Apr 17 20:51:04.987358 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:04.987045 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc" event={"ID":"89e95908-421c-4e11-9f31-6988796bfe8c","Type":"ContainerDied","Data":"0c0169f12fe86dcc669aaafdb91f6fba24415be2e2b2b8c806ee0db74f6d0a77"}
Apr 17 20:51:05.992315 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:05.992279 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-9wqgg" event={"ID":"42379685-30dd-47a3-8dfc-3e0148a7e6ee","Type":"ContainerStarted","Data":"40f2d0f5ba36809512c72d2af8914c3513724680c9804d01a3fdabc4c40b76a5"}
Apr 17 20:51:05.993565 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:05.993543 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-hbvm2" event={"ID":"d247e651-fe24-4236-895a-d0ee408ba4df","Type":"ContainerStarted","Data":"424cb60adf9370eb0c95b419e8e982f887df54be6d96973deec734be9feac8bb"}
Apr 17 20:51:05.993651 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:05.993640 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-hbvm2"
Apr 17 20:51:06.004966 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.004874 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-9wqgg" podStartSLOduration=1.667171905 podStartE2EDuration="4.004855242s" podCreationTimestamp="2026-04-17 20:51:02 +0000 UTC" firstStartedPulling="2026-04-17 20:51:03.153410855 +0000 UTC m=+414.878678194" lastFinishedPulling="2026-04-17 20:51:05.491094196 +0000 UTC m=+417.216361531" observedRunningTime="2026-04-17 20:51:06.004142402 +0000 UTC m=+417.729409760" watchObservedRunningTime="2026-04-17 20:51:06.004855242 +0000 UTC m=+417.730122600"
Apr 17 20:51:06.017330 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.017268 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-hbvm2" podStartSLOduration=1.800720351 podStartE2EDuration="6.017252226s" podCreationTimestamp="2026-04-17 20:51:00 +0000 UTC" firstStartedPulling="2026-04-17 20:51:01.268364247 +0000 UTC m=+412.993631585" lastFinishedPulling="2026-04-17 20:51:05.484896121 +0000 UTC m=+417.210163460" observedRunningTime="2026-04-17 20:51:06.016650556 +0000 UTC m=+417.741917940" watchObservedRunningTime="2026-04-17 20:51:06.017252226 +0000 UTC m=+417.742519583"
Apr 17 20:51:06.123179 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.123156 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc"
Apr 17 20:51:06.244691 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.244616 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89e95908-421c-4e11-9f31-6988796bfe8c-util\") pod \"89e95908-421c-4e11-9f31-6988796bfe8c\" (UID: \"89e95908-421c-4e11-9f31-6988796bfe8c\") "
Apr 17 20:51:06.244691 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.244658 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbnsf\" (UniqueName: \"kubernetes.io/projected/89e95908-421c-4e11-9f31-6988796bfe8c-kube-api-access-fbnsf\") pod \"89e95908-421c-4e11-9f31-6988796bfe8c\" (UID: \"89e95908-421c-4e11-9f31-6988796bfe8c\") "
Apr 17 20:51:06.244931 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.244697 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89e95908-421c-4e11-9f31-6988796bfe8c-bundle\") pod
\"89e95908-421c-4e11-9f31-6988796bfe8c\" (UID: \"89e95908-421c-4e11-9f31-6988796bfe8c\") " Apr 17 20:51:06.245224 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.245182 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89e95908-421c-4e11-9f31-6988796bfe8c-bundle" (OuterVolumeSpecName: "bundle") pod "89e95908-421c-4e11-9f31-6988796bfe8c" (UID: "89e95908-421c-4e11-9f31-6988796bfe8c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:51:06.247056 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.247027 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89e95908-421c-4e11-9f31-6988796bfe8c-kube-api-access-fbnsf" (OuterVolumeSpecName: "kube-api-access-fbnsf") pod "89e95908-421c-4e11-9f31-6988796bfe8c" (UID: "89e95908-421c-4e11-9f31-6988796bfe8c"). InnerVolumeSpecName "kube-api-access-fbnsf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:51:06.250934 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.250905 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89e95908-421c-4e11-9f31-6988796bfe8c-util" (OuterVolumeSpecName: "util") pod "89e95908-421c-4e11-9f31-6988796bfe8c" (UID: "89e95908-421c-4e11-9f31-6988796bfe8c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:51:06.345984 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.345951 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fbnsf\" (UniqueName: \"kubernetes.io/projected/89e95908-421c-4e11-9f31-6988796bfe8c-kube-api-access-fbnsf\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:51:06.345984 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.345981 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89e95908-421c-4e11-9f31-6988796bfe8c-bundle\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:51:06.345984 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.345992 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89e95908-421c-4e11-9f31-6988796bfe8c-util\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:51:06.999710 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.999624 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc" event={"ID":"89e95908-421c-4e11-9f31-6988796bfe8c","Type":"ContainerDied","Data":"b33e5c7c56ed4d9a71b1782f12db9ab26810f2a1b3356f80886422f8d86c1203"} Apr 17 20:51:06.999710 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.999671 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b33e5c7c56ed4d9a71b1782f12db9ab26810f2a1b3356f80886422f8d86c1203" Apr 17 20:51:06.999710 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:06.999640 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhkbc" Apr 17 20:51:12.002420 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:12.002391 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-hbvm2" Apr 17 20:51:18.494089 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.494052 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv"] Apr 17 20:51:18.494539 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.494418 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89e95908-421c-4e11-9f31-6988796bfe8c" containerName="extract" Apr 17 20:51:18.494539 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.494428 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e95908-421c-4e11-9f31-6988796bfe8c" containerName="extract" Apr 17 20:51:18.494539 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.494450 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89e95908-421c-4e11-9f31-6988796bfe8c" containerName="util" Apr 17 20:51:18.494539 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.494456 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e95908-421c-4e11-9f31-6988796bfe8c" containerName="util" Apr 17 20:51:18.494539 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.494463 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89e95908-421c-4e11-9f31-6988796bfe8c" containerName="pull" Apr 17 20:51:18.494539 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.494468 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e95908-421c-4e11-9f31-6988796bfe8c" containerName="pull" Apr 17 20:51:18.494539 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.494530 2572 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="89e95908-421c-4e11-9f31-6988796bfe8c" containerName="extract" Apr 17 20:51:18.502365 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.502341 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" Apr 17 20:51:18.503360 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.503334 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv"] Apr 17 20:51:18.504236 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.504215 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-74lql\"" Apr 17 20:51:18.504320 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.504249 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:51:18.504852 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.504827 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:51:18.540711 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.540688 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07ca9ade-59f6-42a0-976a-4a8486d4bd78-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv\" (UID: \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" Apr 17 20:51:18.540867 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.540744 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwwgv\" (UniqueName: \"kubernetes.io/projected/07ca9ade-59f6-42a0-976a-4a8486d4bd78-kube-api-access-gwwgv\") 
pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv\" (UID: \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" Apr 17 20:51:18.540867 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.540846 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07ca9ade-59f6-42a0-976a-4a8486d4bd78-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv\" (UID: \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" Apr 17 20:51:18.641348 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.641316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwwgv\" (UniqueName: \"kubernetes.io/projected/07ca9ade-59f6-42a0-976a-4a8486d4bd78-kube-api-access-gwwgv\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv\" (UID: \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" Apr 17 20:51:18.641526 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.641357 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07ca9ade-59f6-42a0-976a-4a8486d4bd78-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv\" (UID: \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" Apr 17 20:51:18.641601 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.641574 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07ca9ade-59f6-42a0-976a-4a8486d4bd78-bundle\") pod 
\"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv\" (UID: \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" Apr 17 20:51:18.641724 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.641702 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07ca9ade-59f6-42a0-976a-4a8486d4bd78-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv\" (UID: \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" Apr 17 20:51:18.641942 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.641920 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07ca9ade-59f6-42a0-976a-4a8486d4bd78-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv\" (UID: \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" Apr 17 20:51:18.648375 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.648355 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwwgv\" (UniqueName: \"kubernetes.io/projected/07ca9ade-59f6-42a0-976a-4a8486d4bd78-kube-api-access-gwwgv\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv\" (UID: \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" Apr 17 20:51:18.812192 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.812165 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" Apr 17 20:51:18.932702 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:18.932638 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv"] Apr 17 20:51:18.935244 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:51:18.935217 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07ca9ade_59f6_42a0_976a_4a8486d4bd78.slice/crio-cc8ff8063935813368e83ad16959820bb6f31fe5a653f67cb87df7c4cf854804 WatchSource:0}: Error finding container cc8ff8063935813368e83ad16959820bb6f31fe5a653f67cb87df7c4cf854804: Status 404 returned error can't find the container with id cc8ff8063935813368e83ad16959820bb6f31fe5a653f67cb87df7c4cf854804 Apr 17 20:51:19.039634 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:19.039608 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" event={"ID":"07ca9ade-59f6-42a0-976a-4a8486d4bd78","Type":"ContainerStarted","Data":"e87f6df6d452978192869ea69dcbb2e3d7d32b1a5f95981d7f508422b2b7cd59"} Apr 17 20:51:19.039764 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:19.039641 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" event={"ID":"07ca9ade-59f6-42a0-976a-4a8486d4bd78","Type":"ContainerStarted","Data":"cc8ff8063935813368e83ad16959820bb6f31fe5a653f67cb87df7c4cf854804"} Apr 17 20:51:19.047846 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:19.047797 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-w2v2l"] Apr 17 20:51:19.051125 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:19.051109 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-w2v2l" Apr 17 20:51:19.052925 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:19.052908 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-t4wvd\"" Apr 17 20:51:19.060634 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:19.060606 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-w2v2l"] Apr 17 20:51:19.146217 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:19.146191 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl9nh\" (UniqueName: \"kubernetes.io/projected/3b8828bd-8d25-464a-94d6-7d01bfa9b073-kube-api-access-kl9nh\") pod \"cert-manager-759f64656b-w2v2l\" (UID: \"3b8828bd-8d25-464a-94d6-7d01bfa9b073\") " pod="cert-manager/cert-manager-759f64656b-w2v2l" Apr 17 20:51:19.146350 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:19.146270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b8828bd-8d25-464a-94d6-7d01bfa9b073-bound-sa-token\") pod \"cert-manager-759f64656b-w2v2l\" (UID: \"3b8828bd-8d25-464a-94d6-7d01bfa9b073\") " pod="cert-manager/cert-manager-759f64656b-w2v2l" Apr 17 20:51:19.247066 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:19.247038 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b8828bd-8d25-464a-94d6-7d01bfa9b073-bound-sa-token\") pod \"cert-manager-759f64656b-w2v2l\" (UID: \"3b8828bd-8d25-464a-94d6-7d01bfa9b073\") " pod="cert-manager/cert-manager-759f64656b-w2v2l" Apr 17 20:51:19.247226 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:19.247076 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kl9nh\" (UniqueName: 
\"kubernetes.io/projected/3b8828bd-8d25-464a-94d6-7d01bfa9b073-kube-api-access-kl9nh\") pod \"cert-manager-759f64656b-w2v2l\" (UID: \"3b8828bd-8d25-464a-94d6-7d01bfa9b073\") " pod="cert-manager/cert-manager-759f64656b-w2v2l" Apr 17 20:51:19.254074 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:19.254053 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b8828bd-8d25-464a-94d6-7d01bfa9b073-bound-sa-token\") pod \"cert-manager-759f64656b-w2v2l\" (UID: \"3b8828bd-8d25-464a-94d6-7d01bfa9b073\") " pod="cert-manager/cert-manager-759f64656b-w2v2l" Apr 17 20:51:19.254189 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:19.254173 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl9nh\" (UniqueName: \"kubernetes.io/projected/3b8828bd-8d25-464a-94d6-7d01bfa9b073-kube-api-access-kl9nh\") pod \"cert-manager-759f64656b-w2v2l\" (UID: \"3b8828bd-8d25-464a-94d6-7d01bfa9b073\") " pod="cert-manager/cert-manager-759f64656b-w2v2l" Apr 17 20:51:19.381108 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:19.381013 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-w2v2l" Apr 17 20:51:19.496156 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:19.495995 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-w2v2l"] Apr 17 20:51:19.498758 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:51:19.498730 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b8828bd_8d25_464a_94d6_7d01bfa9b073.slice/crio-114e38f33b52984c4385f32b66f96062a0f31ba257b6702d9d784f4687cdd145 WatchSource:0}: Error finding container 114e38f33b52984c4385f32b66f96062a0f31ba257b6702d9d784f4687cdd145: Status 404 returned error can't find the container with id 114e38f33b52984c4385f32b66f96062a0f31ba257b6702d9d784f4687cdd145 Apr 17 20:51:20.044262 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:20.044218 2572 generic.go:358] "Generic (PLEG): container finished" podID="07ca9ade-59f6-42a0-976a-4a8486d4bd78" containerID="e87f6df6d452978192869ea69dcbb2e3d7d32b1a5f95981d7f508422b2b7cd59" exitCode=0 Apr 17 20:51:20.044262 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:20.044256 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" event={"ID":"07ca9ade-59f6-42a0-976a-4a8486d4bd78","Type":"ContainerDied","Data":"e87f6df6d452978192869ea69dcbb2e3d7d32b1a5f95981d7f508422b2b7cd59"} Apr 17 20:51:20.045727 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:20.045705 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-w2v2l" event={"ID":"3b8828bd-8d25-464a-94d6-7d01bfa9b073","Type":"ContainerStarted","Data":"d9a15379247c15db3c88139c909acc590979d3bb068152f473905e255adf0b40"} Apr 17 20:51:20.045823 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:20.045735 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-w2v2l" 
event={"ID":"3b8828bd-8d25-464a-94d6-7d01bfa9b073","Type":"ContainerStarted","Data":"114e38f33b52984c4385f32b66f96062a0f31ba257b6702d9d784f4687cdd145"} Apr 17 20:51:20.072567 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:20.072518 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-w2v2l" podStartSLOduration=1.072505767 podStartE2EDuration="1.072505767s" podCreationTimestamp="2026-04-17 20:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:51:20.07027207 +0000 UTC m=+431.795539427" watchObservedRunningTime="2026-04-17 20:51:20.072505767 +0000 UTC m=+431.797773124" Apr 17 20:51:21.051142 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:21.051107 2572 generic.go:358] "Generic (PLEG): container finished" podID="07ca9ade-59f6-42a0-976a-4a8486d4bd78" containerID="7e50d5dd9c227df99178db620277d5c710f37ca401cffaed0ca94d3f76193756" exitCode=0 Apr 17 20:51:21.051568 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:21.051197 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" event={"ID":"07ca9ade-59f6-42a0-976a-4a8486d4bd78","Type":"ContainerDied","Data":"7e50d5dd9c227df99178db620277d5c710f37ca401cffaed0ca94d3f76193756"} Apr 17 20:51:22.059160 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:22.059120 2572 generic.go:358] "Generic (PLEG): container finished" podID="07ca9ade-59f6-42a0-976a-4a8486d4bd78" containerID="59ad6645c8e31e9688875a4659210d906adf64ccf86dcce99f064c61f955bb42" exitCode=0 Apr 17 20:51:22.059530 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:22.059244 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" 
event={"ID":"07ca9ade-59f6-42a0-976a-4a8486d4bd78","Type":"ContainerDied","Data":"59ad6645c8e31e9688875a4659210d906adf64ccf86dcce99f064c61f955bb42"} Apr 17 20:51:23.189436 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:23.189413 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" Apr 17 20:51:23.275912 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:23.275874 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwwgv\" (UniqueName: \"kubernetes.io/projected/07ca9ade-59f6-42a0-976a-4a8486d4bd78-kube-api-access-gwwgv\") pod \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\" (UID: \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\") " Apr 17 20:51:23.276065 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:23.275936 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07ca9ade-59f6-42a0-976a-4a8486d4bd78-bundle\") pod \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\" (UID: \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\") " Apr 17 20:51:23.276065 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:23.275984 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07ca9ade-59f6-42a0-976a-4a8486d4bd78-util\") pod \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\" (UID: \"07ca9ade-59f6-42a0-976a-4a8486d4bd78\") " Apr 17 20:51:23.276637 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:23.276613 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07ca9ade-59f6-42a0-976a-4a8486d4bd78-bundle" (OuterVolumeSpecName: "bundle") pod "07ca9ade-59f6-42a0-976a-4a8486d4bd78" (UID: "07ca9ade-59f6-42a0-976a-4a8486d4bd78"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:51:23.277914 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:23.277889 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ca9ade-59f6-42a0-976a-4a8486d4bd78-kube-api-access-gwwgv" (OuterVolumeSpecName: "kube-api-access-gwwgv") pod "07ca9ade-59f6-42a0-976a-4a8486d4bd78" (UID: "07ca9ade-59f6-42a0-976a-4a8486d4bd78"). InnerVolumeSpecName "kube-api-access-gwwgv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:51:23.281982 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:23.281961 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07ca9ade-59f6-42a0-976a-4a8486d4bd78-util" (OuterVolumeSpecName: "util") pod "07ca9ade-59f6-42a0-976a-4a8486d4bd78" (UID: "07ca9ade-59f6-42a0-976a-4a8486d4bd78"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:51:23.376882 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:23.376796 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gwwgv\" (UniqueName: \"kubernetes.io/projected/07ca9ade-59f6-42a0-976a-4a8486d4bd78-kube-api-access-gwwgv\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:51:23.376882 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:23.376841 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07ca9ade-59f6-42a0-976a-4a8486d4bd78-bundle\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:51:23.376882 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:23.376851 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07ca9ade-59f6-42a0-976a-4a8486d4bd78-util\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:51:24.068103 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:24.068077 2572 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" Apr 17 20:51:24.068275 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:24.068067 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5smlbv" event={"ID":"07ca9ade-59f6-42a0-976a-4a8486d4bd78","Type":"ContainerDied","Data":"cc8ff8063935813368e83ad16959820bb6f31fe5a653f67cb87df7c4cf854804"} Apr 17 20:51:24.068275 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:24.068189 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc8ff8063935813368e83ad16959820bb6f31fe5a653f67cb87df7c4cf854804" Apr 17 20:51:35.086177 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.086145 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"] Apr 17 20:51:35.086787 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.086660 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07ca9ade-59f6-42a0-976a-4a8486d4bd78" containerName="util" Apr 17 20:51:35.086787 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.086681 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ca9ade-59f6-42a0-976a-4a8486d4bd78" containerName="util" Apr 17 20:51:35.086787 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.086694 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07ca9ade-59f6-42a0-976a-4a8486d4bd78" containerName="extract" Apr 17 20:51:35.086787 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.086702 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ca9ade-59f6-42a0-976a-4a8486d4bd78" containerName="extract" Apr 17 20:51:35.086787 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.086745 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="07ca9ade-59f6-42a0-976a-4a8486d4bd78" containerName="pull"
Apr 17 20:51:35.086787 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.086753 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ca9ade-59f6-42a0-976a-4a8486d4bd78" containerName="pull"
Apr 17 20:51:35.087154 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.086846 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="07ca9ade-59f6-42a0-976a-4a8486d4bd78" containerName="extract"
Apr 17 20:51:35.092998 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.092978 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"
Apr 17 20:51:35.095044 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.095021 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-qjnvg\""
Apr 17 20:51:35.095210 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.095191 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 17 20:51:35.095259 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.095230 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 17 20:51:35.095302 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.095277 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 17 20:51:35.095302 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.095235 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 17 20:51:35.106052 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.106027 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"]
Apr 17 20:51:35.163955 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.163920 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9167dae3-8f4c-426c-b9f7-095acc43072f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6dc4849f89-8z4fv\" (UID: \"9167dae3-8f4c-426c-b9f7-095acc43072f\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"
Apr 17 20:51:35.164116 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.163964 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9167dae3-8f4c-426c-b9f7-095acc43072f-webhook-cert\") pod \"opendatahub-operator-controller-manager-6dc4849f89-8z4fv\" (UID: \"9167dae3-8f4c-426c-b9f7-095acc43072f\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"
Apr 17 20:51:35.164116 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.164042 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtqsv\" (UniqueName: \"kubernetes.io/projected/9167dae3-8f4c-426c-b9f7-095acc43072f-kube-api-access-jtqsv\") pod \"opendatahub-operator-controller-manager-6dc4849f89-8z4fv\" (UID: \"9167dae3-8f4c-426c-b9f7-095acc43072f\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"
Apr 17 20:51:35.265182 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.265143 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtqsv\" (UniqueName: \"kubernetes.io/projected/9167dae3-8f4c-426c-b9f7-095acc43072f-kube-api-access-jtqsv\") pod \"opendatahub-operator-controller-manager-6dc4849f89-8z4fv\" (UID: \"9167dae3-8f4c-426c-b9f7-095acc43072f\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"
Apr 17 20:51:35.265358 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.265219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9167dae3-8f4c-426c-b9f7-095acc43072f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6dc4849f89-8z4fv\" (UID: \"9167dae3-8f4c-426c-b9f7-095acc43072f\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"
Apr 17 20:51:35.265358 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.265292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9167dae3-8f4c-426c-b9f7-095acc43072f-webhook-cert\") pod \"opendatahub-operator-controller-manager-6dc4849f89-8z4fv\" (UID: \"9167dae3-8f4c-426c-b9f7-095acc43072f\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"
Apr 17 20:51:35.267776 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.267754 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9167dae3-8f4c-426c-b9f7-095acc43072f-webhook-cert\") pod \"opendatahub-operator-controller-manager-6dc4849f89-8z4fv\" (UID: \"9167dae3-8f4c-426c-b9f7-095acc43072f\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"
Apr 17 20:51:35.267940 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.267916 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9167dae3-8f4c-426c-b9f7-095acc43072f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6dc4849f89-8z4fv\" (UID: \"9167dae3-8f4c-426c-b9f7-095acc43072f\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"
Apr 17 20:51:35.275953 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.275924 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtqsv\" (UniqueName: \"kubernetes.io/projected/9167dae3-8f4c-426c-b9f7-095acc43072f-kube-api-access-jtqsv\") pod \"opendatahub-operator-controller-manager-6dc4849f89-8z4fv\" (UID: \"9167dae3-8f4c-426c-b9f7-095acc43072f\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"
Apr 17 20:51:35.403616 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.403535 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"
Apr 17 20:51:35.532576 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.532552 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"]
Apr 17 20:51:35.534687 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:51:35.534660 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9167dae3_8f4c_426c_b9f7_095acc43072f.slice/crio-4c06d2edad6f578353f02be3d9b4a9129e65eee3aeee89bf9e25b9b2dc6e9ec2 WatchSource:0}: Error finding container 4c06d2edad6f578353f02be3d9b4a9129e65eee3aeee89bf9e25b9b2dc6e9ec2: Status 404 returned error can't find the container with id 4c06d2edad6f578353f02be3d9b4a9129e65eee3aeee89bf9e25b9b2dc6e9ec2
Apr 17 20:51:35.716584 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.716507 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"]
Apr 17 20:51:35.722666 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.722642 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"
Apr 17 20:51:35.724930 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.724892 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 20:51:35.725040 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.724905 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 20:51:35.725040 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.724980 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-74lql\""
Apr 17 20:51:35.725709 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.725685 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"]
Apr 17 20:51:35.768702 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.768674 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bb3f2bf9-f138-412f-8243-3a5b3618cd37-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7\" (UID: \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"
Apr 17 20:51:35.768887 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.768710 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gtbp\" (UniqueName: \"kubernetes.io/projected/bb3f2bf9-f138-412f-8243-3a5b3618cd37-kube-api-access-6gtbp\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7\" (UID: \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"
Apr 17 20:51:35.768887 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.768734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bb3f2bf9-f138-412f-8243-3a5b3618cd37-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7\" (UID: \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"
Apr 17 20:51:35.870187 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.870157 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bb3f2bf9-f138-412f-8243-3a5b3618cd37-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7\" (UID: \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"
Apr 17 20:51:35.870187 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.870193 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gtbp\" (UniqueName: \"kubernetes.io/projected/bb3f2bf9-f138-412f-8243-3a5b3618cd37-kube-api-access-6gtbp\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7\" (UID: \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"
Apr 17 20:51:35.870413 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.870286 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bb3f2bf9-f138-412f-8243-3a5b3618cd37-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7\" (UID: \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"
Apr 17 20:51:35.870527 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.870505 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bb3f2bf9-f138-412f-8243-3a5b3618cd37-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7\" (UID: \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"
Apr 17 20:51:35.870598 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.870555 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bb3f2bf9-f138-412f-8243-3a5b3618cd37-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7\" (UID: \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"
Apr 17 20:51:35.881235 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:35.881213 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gtbp\" (UniqueName: \"kubernetes.io/projected/bb3f2bf9-f138-412f-8243-3a5b3618cd37-kube-api-access-6gtbp\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7\" (UID: \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"
Apr 17 20:51:36.032869 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:36.032825 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"
Apr 17 20:51:36.114257 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:36.114198 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv" event={"ID":"9167dae3-8f4c-426c-b9f7-095acc43072f","Type":"ContainerStarted","Data":"4c06d2edad6f578353f02be3d9b4a9129e65eee3aeee89bf9e25b9b2dc6e9ec2"}
Apr 17 20:51:36.189444 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:36.188022 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"]
Apr 17 20:51:37.119720 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:37.119692 2572 generic.go:358] "Generic (PLEG): container finished" podID="bb3f2bf9-f138-412f-8243-3a5b3618cd37" containerID="67ff5d1db93510f56692988391daedcb75f22f087fe720518e3c8155ef5b54e0" exitCode=0
Apr 17 20:51:37.120151 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:37.119741 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7" event={"ID":"bb3f2bf9-f138-412f-8243-3a5b3618cd37","Type":"ContainerDied","Data":"67ff5d1db93510f56692988391daedcb75f22f087fe720518e3c8155ef5b54e0"}
Apr 17 20:51:37.120151 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:37.119778 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7" event={"ID":"bb3f2bf9-f138-412f-8243-3a5b3618cd37","Type":"ContainerStarted","Data":"db338e551d9008c97b34140e761a362e7494de925927ee563e6f08e14eb27974"}
Apr 17 20:51:39.127565 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:39.127534 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv" event={"ID":"9167dae3-8f4c-426c-b9f7-095acc43072f","Type":"ContainerStarted","Data":"b7417848ffff9f9003b907ba2ccaf5005dd7c60d64bb196131213298b489f71b"}
Apr 17 20:51:39.128010 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:39.127657 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"
Apr 17 20:51:39.129121 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:39.129095 2572 generic.go:358] "Generic (PLEG): container finished" podID="bb3f2bf9-f138-412f-8243-3a5b3618cd37" containerID="346231218c0a330e9d9cfff3e07dd779a732d3fa01b067438a8785b02792eb72" exitCode=0
Apr 17 20:51:39.129237 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:39.129175 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7" event={"ID":"bb3f2bf9-f138-412f-8243-3a5b3618cd37","Type":"ContainerDied","Data":"346231218c0a330e9d9cfff3e07dd779a732d3fa01b067438a8785b02792eb72"}
Apr 17 20:51:39.145412 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:39.145367 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv" podStartSLOduration=1.282622819 podStartE2EDuration="4.145353902s" podCreationTimestamp="2026-04-17 20:51:35 +0000 UTC" firstStartedPulling="2026-04-17 20:51:35.536304645 +0000 UTC m=+447.261571980" lastFinishedPulling="2026-04-17 20:51:38.399035713 +0000 UTC m=+450.124303063" observedRunningTime="2026-04-17 20:51:39.143194438 +0000 UTC m=+450.868461794" watchObservedRunningTime="2026-04-17 20:51:39.145353902 +0000 UTC m=+450.870621258"
Apr 17 20:51:40.135230 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:40.135194 2572 generic.go:358] "Generic (PLEG): container finished" podID="bb3f2bf9-f138-412f-8243-3a5b3618cd37" containerID="b443ec37c1b05c81dedc26ebad8140bdf9aa2ddac145e8d0ada278e3866bece5" exitCode=0
Apr 17 20:51:40.135582 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:40.135306 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7" event={"ID":"bb3f2bf9-f138-412f-8243-3a5b3618cd37","Type":"ContainerDied","Data":"b443ec37c1b05c81dedc26ebad8140bdf9aa2ddac145e8d0ada278e3866bece5"}
Apr 17 20:51:41.270313 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:41.270286 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"
Apr 17 20:51:41.318033 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:41.318006 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bb3f2bf9-f138-412f-8243-3a5b3618cd37-util\") pod \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\" (UID: \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\") "
Apr 17 20:51:41.318142 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:41.318104 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bb3f2bf9-f138-412f-8243-3a5b3618cd37-bundle\") pod \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\" (UID: \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\") "
Apr 17 20:51:41.318142 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:41.318134 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gtbp\" (UniqueName: \"kubernetes.io/projected/bb3f2bf9-f138-412f-8243-3a5b3618cd37-kube-api-access-6gtbp\") pod \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\" (UID: \"bb3f2bf9-f138-412f-8243-3a5b3618cd37\") "
Apr 17 20:51:41.318963 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:41.318940 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb3f2bf9-f138-412f-8243-3a5b3618cd37-bundle" (OuterVolumeSpecName: "bundle") pod "bb3f2bf9-f138-412f-8243-3a5b3618cd37" (UID: "bb3f2bf9-f138-412f-8243-3a5b3618cd37"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:51:41.320072 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:41.320053 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3f2bf9-f138-412f-8243-3a5b3618cd37-kube-api-access-6gtbp" (OuterVolumeSpecName: "kube-api-access-6gtbp") pod "bb3f2bf9-f138-412f-8243-3a5b3618cd37" (UID: "bb3f2bf9-f138-412f-8243-3a5b3618cd37"). InnerVolumeSpecName "kube-api-access-6gtbp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:51:41.323475 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:41.323439 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb3f2bf9-f138-412f-8243-3a5b3618cd37-util" (OuterVolumeSpecName: "util") pod "bb3f2bf9-f138-412f-8243-3a5b3618cd37" (UID: "bb3f2bf9-f138-412f-8243-3a5b3618cd37"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:51:41.419632 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:41.419566 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bb3f2bf9-f138-412f-8243-3a5b3618cd37-util\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:51:41.419632 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:41.419600 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bb3f2bf9-f138-412f-8243-3a5b3618cd37-bundle\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:51:41.419632 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:41.419611 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6gtbp\" (UniqueName: \"kubernetes.io/projected/bb3f2bf9-f138-412f-8243-3a5b3618cd37-kube-api-access-6gtbp\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:51:42.144636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:42.144602 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7"
Apr 17 20:51:42.144636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:42.144612 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9p7wm7" event={"ID":"bb3f2bf9-f138-412f-8243-3a5b3618cd37","Type":"ContainerDied","Data":"db338e551d9008c97b34140e761a362e7494de925927ee563e6f08e14eb27974"}
Apr 17 20:51:42.144861 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:42.144645 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db338e551d9008c97b34140e761a362e7494de925927ee563e6f08e14eb27974"
Apr 17 20:51:47.523096 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.523066 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"]
Apr 17 20:51:47.523446 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.523437 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb3f2bf9-f138-412f-8243-3a5b3618cd37" containerName="util"
Apr 17 20:51:47.523491 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.523452 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3f2bf9-f138-412f-8243-3a5b3618cd37" containerName="util"
Apr 17 20:51:47.523491 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.523470 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb3f2bf9-f138-412f-8243-3a5b3618cd37" containerName="extract"
Apr 17 20:51:47.523491 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.523478 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3f2bf9-f138-412f-8243-3a5b3618cd37" containerName="extract"
Apr 17 20:51:47.523582 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.523514 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb3f2bf9-f138-412f-8243-3a5b3618cd37" containerName="pull"
Apr 17 20:51:47.523582 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.523524 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3f2bf9-f138-412f-8243-3a5b3618cd37" containerName="pull"
Apr 17 20:51:47.523650 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.523631 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb3f2bf9-f138-412f-8243-3a5b3618cd37" containerName="extract"
Apr 17 20:51:47.526055 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.526042 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:47.528656 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.528631 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:51:47.528970 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.528950 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-2rm4j\""
Apr 17 20:51:47.529045 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.528971 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 17 20:51:47.529045 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.528974 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 17 20:51:47.529045 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.529017 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 17 20:51:47.529205 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.529063 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 17 20:51:47.536434 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.536409 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"]
Apr 17 20:51:47.570964 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.570930 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmmnh\" (UniqueName: \"kubernetes.io/projected/8d34bf09-f220-4c50-b3b5-61a87a621564-kube-api-access-cmmnh\") pod \"lws-controller-manager-7f68665c84-k8xv9\" (UID: \"8d34bf09-f220-4c50-b3b5-61a87a621564\") " pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:47.571155 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.571031 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8d34bf09-f220-4c50-b3b5-61a87a621564-manager-config\") pod \"lws-controller-manager-7f68665c84-k8xv9\" (UID: \"8d34bf09-f220-4c50-b3b5-61a87a621564\") " pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:47.571155 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.571067 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d34bf09-f220-4c50-b3b5-61a87a621564-cert\") pod \"lws-controller-manager-7f68665c84-k8xv9\" (UID: \"8d34bf09-f220-4c50-b3b5-61a87a621564\") " pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:47.571155 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.571136 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8d34bf09-f220-4c50-b3b5-61a87a621564-metrics-cert\") pod \"lws-controller-manager-7f68665c84-k8xv9\" (UID: \"8d34bf09-f220-4c50-b3b5-61a87a621564\") " pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:47.672230 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.672193 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8d34bf09-f220-4c50-b3b5-61a87a621564-metrics-cert\") pod \"lws-controller-manager-7f68665c84-k8xv9\" (UID: \"8d34bf09-f220-4c50-b3b5-61a87a621564\") " pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:47.672403 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.672245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmmnh\" (UniqueName: \"kubernetes.io/projected/8d34bf09-f220-4c50-b3b5-61a87a621564-kube-api-access-cmmnh\") pod \"lws-controller-manager-7f68665c84-k8xv9\" (UID: \"8d34bf09-f220-4c50-b3b5-61a87a621564\") " pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:47.672403 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.672287 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8d34bf09-f220-4c50-b3b5-61a87a621564-manager-config\") pod \"lws-controller-manager-7f68665c84-k8xv9\" (UID: \"8d34bf09-f220-4c50-b3b5-61a87a621564\") " pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:47.672403 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.672306 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d34bf09-f220-4c50-b3b5-61a87a621564-cert\") pod \"lws-controller-manager-7f68665c84-k8xv9\" (UID: \"8d34bf09-f220-4c50-b3b5-61a87a621564\") " pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:47.673016 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.672992 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8d34bf09-f220-4c50-b3b5-61a87a621564-manager-config\") pod \"lws-controller-manager-7f68665c84-k8xv9\" (UID: \"8d34bf09-f220-4c50-b3b5-61a87a621564\") " pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:47.674749 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.674721 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d34bf09-f220-4c50-b3b5-61a87a621564-cert\") pod \"lws-controller-manager-7f68665c84-k8xv9\" (UID: \"8d34bf09-f220-4c50-b3b5-61a87a621564\") " pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:47.674889 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.674754 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8d34bf09-f220-4c50-b3b5-61a87a621564-metrics-cert\") pod \"lws-controller-manager-7f68665c84-k8xv9\" (UID: \"8d34bf09-f220-4c50-b3b5-61a87a621564\") " pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:47.679588 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.679569 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmmnh\" (UniqueName: \"kubernetes.io/projected/8d34bf09-f220-4c50-b3b5-61a87a621564-kube-api-access-cmmnh\") pod \"lws-controller-manager-7f68665c84-k8xv9\" (UID: \"8d34bf09-f220-4c50-b3b5-61a87a621564\") " pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:47.835512 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.835474 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:47.966891 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:47.966865 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"]
Apr 17 20:51:47.969117 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:51:47.969087 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d34bf09_f220_4c50_b3b5_61a87a621564.slice/crio-2f42ef01e42ef385358e2e61480a51de49678ef38e161ef961870e829f770f0b WatchSource:0}: Error finding container 2f42ef01e42ef385358e2e61480a51de49678ef38e161ef961870e829f770f0b: Status 404 returned error can't find the container with id 2f42ef01e42ef385358e2e61480a51de49678ef38e161ef961870e829f770f0b
Apr 17 20:51:48.168046 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:48.167958 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9" event={"ID":"8d34bf09-f220-4c50-b3b5-61a87a621564","Type":"ContainerStarted","Data":"2f42ef01e42ef385358e2e61480a51de49678ef38e161ef961870e829f770f0b"}
Apr 17 20:51:50.137760 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:50.137732 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-8z4fv"
Apr 17 20:51:50.176853 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:50.176818 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9" event={"ID":"8d34bf09-f220-4c50-b3b5-61a87a621564","Type":"ContainerStarted","Data":"1cc1a1df03745aa5124080949ba8f9498f459c26e1dee6073d31d0908d64a690"}
Apr 17 20:51:50.177048 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:50.176883 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9"
Apr 17 20:51:50.195557 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:50.195510 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9" podStartSLOduration=1.677002433 podStartE2EDuration="3.195496924s" podCreationTimestamp="2026-04-17 20:51:47 +0000 UTC" firstStartedPulling="2026-04-17 20:51:47.971121094 +0000 UTC m=+459.696388444" lastFinishedPulling="2026-04-17 20:51:49.489615601 +0000 UTC m=+461.214882935" observedRunningTime="2026-04-17 20:51:50.192651603 +0000 UTC m=+461.917918960" watchObservedRunningTime="2026-04-17 20:51:50.195496924 +0000 UTC m=+461.920764281"
Apr 17 20:51:54.901164 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:54.901127 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh"]
Apr 17 20:51:54.905020 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:54.905005 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh"
Apr 17 20:51:54.906852 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:54.906830 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-74lql\""
Apr 17 20:51:54.907034 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:54.907022 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 20:51:54.907362 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:54.907341 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 20:51:54.910651 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:54.910631 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh"]
Apr 17 20:51:55.033197 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:55.033159 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh\" (UID: \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh"
Apr 17 20:51:55.033362 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:55.033206 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh\" (UID: \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh"
Apr 17 20:51:55.033362 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:55.033234 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzprb\" (UniqueName: \"kubernetes.io/projected/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-kube-api-access-mzprb\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh\" (UID: \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh"
Apr 17 20:51:55.134609 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:55.134581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh\" (UID: \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh"
Apr 17 20:51:55.134790 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:55.134619 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzprb\" (UniqueName: \"kubernetes.io/projected/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-kube-api-access-mzprb\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh\" (UID: \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh"
Apr 17 20:51:55.134790 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:55.134689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh\" (UID: \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh"
Apr 17 20:51:55.134959 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:55.134940 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh\" (UID: \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh"
Apr 17 20:51:55.135026 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:55.134975 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh\" (UID: \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh"
Apr 17 20:51:55.143028 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:55.143009 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzprb\" (UniqueName: \"kubernetes.io/projected/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-kube-api-access-mzprb\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh\" (UID: \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh"
Apr 17 20:51:55.215322 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:55.215259 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh" Apr 17 20:51:55.335510 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:55.335486 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh"] Apr 17 20:51:55.337917 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:51:55.337883 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad88b84b_c8d2_46e7_80b7_78ba103b07e9.slice/crio-8e621a26265ae899f292bcf3c522ab725230016f10849cd89ab0fbb75e0d74a6 WatchSource:0}: Error finding container 8e621a26265ae899f292bcf3c522ab725230016f10849cd89ab0fbb75e0d74a6: Status 404 returned error can't find the container with id 8e621a26265ae899f292bcf3c522ab725230016f10849cd89ab0fbb75e0d74a6 Apr 17 20:51:56.198116 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:56.198082 2572 generic.go:358] "Generic (PLEG): container finished" podID="ad88b84b-c8d2-46e7-80b7-78ba103b07e9" containerID="d8d7c1210f2082c92776cf6018e701ad37ebebe81f1a22846f30dc2012daac3c" exitCode=0 Apr 17 20:51:56.198549 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:56.198158 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh" event={"ID":"ad88b84b-c8d2-46e7-80b7-78ba103b07e9","Type":"ContainerDied","Data":"d8d7c1210f2082c92776cf6018e701ad37ebebe81f1a22846f30dc2012daac3c"} Apr 17 20:51:56.198549 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:56.198180 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh" event={"ID":"ad88b84b-c8d2-46e7-80b7-78ba103b07e9","Type":"ContainerStarted","Data":"8e621a26265ae899f292bcf3c522ab725230016f10849cd89ab0fbb75e0d74a6"} Apr 17 20:51:57.203188 ip-10-0-137-102 kubenswrapper[2572]: 
I0417 20:51:57.203153 2572 generic.go:358] "Generic (PLEG): container finished" podID="ad88b84b-c8d2-46e7-80b7-78ba103b07e9" containerID="2a79f2b9b0a720c55cbc05aaa6f134beeeadc0205560314cd9a4eb76357bc118" exitCode=0 Apr 17 20:51:57.203698 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:57.203229 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh" event={"ID":"ad88b84b-c8d2-46e7-80b7-78ba103b07e9","Type":"ContainerDied","Data":"2a79f2b9b0a720c55cbc05aaa6f134beeeadc0205560314cd9a4eb76357bc118"} Apr 17 20:51:58.208641 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:58.208606 2572 generic.go:358] "Generic (PLEG): container finished" podID="ad88b84b-c8d2-46e7-80b7-78ba103b07e9" containerID="c1f65dac2c68a0d7d9eabde18e9a27437803712a0bf03e65fead97f0af61e03c" exitCode=0 Apr 17 20:51:58.209036 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:58.208683 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh" event={"ID":"ad88b84b-c8d2-46e7-80b7-78ba103b07e9","Type":"ContainerDied","Data":"c1f65dac2c68a0d7d9eabde18e9a27437803712a0bf03e65fead97f0af61e03c"} Apr 17 20:51:59.329476 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:59.329452 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh" Apr 17 20:51:59.472972 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:59.472895 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzprb\" (UniqueName: \"kubernetes.io/projected/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-kube-api-access-mzprb\") pod \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\" (UID: \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\") " Apr 17 20:51:59.472972 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:59.472960 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-bundle\") pod \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\" (UID: \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\") " Apr 17 20:51:59.473184 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:59.472980 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-util\") pod \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\" (UID: \"ad88b84b-c8d2-46e7-80b7-78ba103b07e9\") " Apr 17 20:51:59.474151 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:59.474124 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-bundle" (OuterVolumeSpecName: "bundle") pod "ad88b84b-c8d2-46e7-80b7-78ba103b07e9" (UID: "ad88b84b-c8d2-46e7-80b7-78ba103b07e9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:51:59.474937 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:59.474914 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-kube-api-access-mzprb" (OuterVolumeSpecName: "kube-api-access-mzprb") pod "ad88b84b-c8d2-46e7-80b7-78ba103b07e9" (UID: "ad88b84b-c8d2-46e7-80b7-78ba103b07e9"). InnerVolumeSpecName "kube-api-access-mzprb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:51:59.479001 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:59.478969 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-util" (OuterVolumeSpecName: "util") pod "ad88b84b-c8d2-46e7-80b7-78ba103b07e9" (UID: "ad88b84b-c8d2-46e7-80b7-78ba103b07e9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:51:59.574393 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:59.574347 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mzprb\" (UniqueName: \"kubernetes.io/projected/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-kube-api-access-mzprb\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:51:59.574393 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:59.574388 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-bundle\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:51:59.574393 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:51:59.574406 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad88b84b-c8d2-46e7-80b7-78ba103b07e9-util\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:52:00.217945 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:00.217916 2572 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh" Apr 17 20:52:00.218134 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:00.217920 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wchgh" event={"ID":"ad88b84b-c8d2-46e7-80b7-78ba103b07e9","Type":"ContainerDied","Data":"8e621a26265ae899f292bcf3c522ab725230016f10849cd89ab0fbb75e0d74a6"} Apr 17 20:52:00.218134 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:00.218031 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e621a26265ae899f292bcf3c522ab725230016f10849cd89ab0fbb75e0d74a6" Apr 17 20:52:01.182760 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:01.182733 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7f68665c84-k8xv9" Apr 17 20:52:09.651636 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.651596 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9"] Apr 17 20:52:09.652427 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.652404 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad88b84b-c8d2-46e7-80b7-78ba103b07e9" containerName="extract" Apr 17 20:52:09.652427 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.652430 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad88b84b-c8d2-46e7-80b7-78ba103b07e9" containerName="extract" Apr 17 20:52:09.652571 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.652493 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad88b84b-c8d2-46e7-80b7-78ba103b07e9" containerName="util" Apr 17 20:52:09.652571 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.652502 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ad88b84b-c8d2-46e7-80b7-78ba103b07e9" containerName="util" Apr 17 20:52:09.652571 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.652529 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad88b84b-c8d2-46e7-80b7-78ba103b07e9" containerName="pull" Apr 17 20:52:09.652571 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.652538 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad88b84b-c8d2-46e7-80b7-78ba103b07e9" containerName="pull" Apr 17 20:52:09.652754 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.652694 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad88b84b-c8d2-46e7-80b7-78ba103b07e9" containerName="extract" Apr 17 20:52:09.662071 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.662044 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" Apr 17 20:52:09.662379 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.662072 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9"] Apr 17 20:52:09.664352 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.664133 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:52:09.664352 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.664157 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:52:09.664352 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.664260 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-74lql\"" Apr 17 20:52:09.765765 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.765735 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/226db398-0910-4352-b800-be538674c9dc-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9\" (UID: \"226db398-0910-4352-b800-be538674c9dc\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" Apr 17 20:52:09.765946 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.765787 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxtnr\" (UniqueName: \"kubernetes.io/projected/226db398-0910-4352-b800-be538674c9dc-kube-api-access-hxtnr\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9\" (UID: \"226db398-0910-4352-b800-be538674c9dc\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" Apr 17 20:52:09.765946 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.765871 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/226db398-0910-4352-b800-be538674c9dc-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9\" (UID: \"226db398-0910-4352-b800-be538674c9dc\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" Apr 17 20:52:09.866550 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.866511 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/226db398-0910-4352-b800-be538674c9dc-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9\" (UID: \"226db398-0910-4352-b800-be538674c9dc\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" Apr 17 20:52:09.866732 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.866580 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/226db398-0910-4352-b800-be538674c9dc-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9\" (UID: \"226db398-0910-4352-b800-be538674c9dc\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" Apr 17 20:52:09.866732 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.866626 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxtnr\" (UniqueName: \"kubernetes.io/projected/226db398-0910-4352-b800-be538674c9dc-kube-api-access-hxtnr\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9\" (UID: \"226db398-0910-4352-b800-be538674c9dc\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" Apr 17 20:52:09.866905 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.866884 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/226db398-0910-4352-b800-be538674c9dc-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9\" (UID: \"226db398-0910-4352-b800-be538674c9dc\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" Apr 17 20:52:09.866967 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.866915 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/226db398-0910-4352-b800-be538674c9dc-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9\" (UID: \"226db398-0910-4352-b800-be538674c9dc\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" Apr 17 20:52:09.874582 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.874561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxtnr\" (UniqueName: 
\"kubernetes.io/projected/226db398-0910-4352-b800-be538674c9dc-kube-api-access-hxtnr\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9\" (UID: \"226db398-0910-4352-b800-be538674c9dc\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" Apr 17 20:52:09.972475 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:09.972413 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" Apr 17 20:52:10.091522 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:10.091498 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9"] Apr 17 20:52:10.093981 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:52:10.093951 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod226db398_0910_4352_b800_be538674c9dc.slice/crio-0e3b2bd720d66a30d93d9067b277387486ae718c95222d92a79ac926db621b5d WatchSource:0}: Error finding container 0e3b2bd720d66a30d93d9067b277387486ae718c95222d92a79ac926db621b5d: Status 404 returned error can't find the container with id 0e3b2bd720d66a30d93d9067b277387486ae718c95222d92a79ac926db621b5d Apr 17 20:52:10.255416 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:10.255335 2572 generic.go:358] "Generic (PLEG): container finished" podID="226db398-0910-4352-b800-be538674c9dc" containerID="e250c1d2b0493295e3c9e5b36caf3302d4c21add7b3db8c62cf04638ca6b28b6" exitCode=0 Apr 17 20:52:10.255544 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:10.255425 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" event={"ID":"226db398-0910-4352-b800-be538674c9dc","Type":"ContainerDied","Data":"e250c1d2b0493295e3c9e5b36caf3302d4c21add7b3db8c62cf04638ca6b28b6"} Apr 17 
20:52:10.255544 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:10.255463 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" event={"ID":"226db398-0910-4352-b800-be538674c9dc","Type":"ContainerStarted","Data":"0e3b2bd720d66a30d93d9067b277387486ae718c95222d92a79ac926db621b5d"} Apr 17 20:52:12.263527 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:12.263496 2572 generic.go:358] "Generic (PLEG): container finished" podID="226db398-0910-4352-b800-be538674c9dc" containerID="9b44e6073507a6f9c7bdd2f64a1576c4ec13c3444c082901648503453364e720" exitCode=0 Apr 17 20:52:12.264047 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:12.263591 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" event={"ID":"226db398-0910-4352-b800-be538674c9dc","Type":"ContainerDied","Data":"9b44e6073507a6f9c7bdd2f64a1576c4ec13c3444c082901648503453364e720"} Apr 17 20:52:13.268510 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:13.268475 2572 generic.go:358] "Generic (PLEG): container finished" podID="226db398-0910-4352-b800-be538674c9dc" containerID="5afa1b8b8c358a7fe01929a6b9d9b36a622a122ff3d696874d10dfa065df939c" exitCode=0 Apr 17 20:52:13.268906 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:13.268559 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" event={"ID":"226db398-0910-4352-b800-be538674c9dc","Type":"ContainerDied","Data":"5afa1b8b8c358a7fe01929a6b9d9b36a622a122ff3d696874d10dfa065df939c"} Apr 17 20:52:14.388992 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:14.388969 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" Apr 17 20:52:14.508938 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:14.508903 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/226db398-0910-4352-b800-be538674c9dc-bundle\") pod \"226db398-0910-4352-b800-be538674c9dc\" (UID: \"226db398-0910-4352-b800-be538674c9dc\") " Apr 17 20:52:14.509103 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:14.508956 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxtnr\" (UniqueName: \"kubernetes.io/projected/226db398-0910-4352-b800-be538674c9dc-kube-api-access-hxtnr\") pod \"226db398-0910-4352-b800-be538674c9dc\" (UID: \"226db398-0910-4352-b800-be538674c9dc\") " Apr 17 20:52:14.509103 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:14.508999 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/226db398-0910-4352-b800-be538674c9dc-util\") pod \"226db398-0910-4352-b800-be538674c9dc\" (UID: \"226db398-0910-4352-b800-be538674c9dc\") " Apr 17 20:52:14.509821 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:14.509775 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/226db398-0910-4352-b800-be538674c9dc-bundle" (OuterVolumeSpecName: "bundle") pod "226db398-0910-4352-b800-be538674c9dc" (UID: "226db398-0910-4352-b800-be538674c9dc"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:52:14.511000 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:14.510972 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/226db398-0910-4352-b800-be538674c9dc-kube-api-access-hxtnr" (OuterVolumeSpecName: "kube-api-access-hxtnr") pod "226db398-0910-4352-b800-be538674c9dc" (UID: "226db398-0910-4352-b800-be538674c9dc"). InnerVolumeSpecName "kube-api-access-hxtnr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:52:14.517045 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:14.517020 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/226db398-0910-4352-b800-be538674c9dc-util" (OuterVolumeSpecName: "util") pod "226db398-0910-4352-b800-be538674c9dc" (UID: "226db398-0910-4352-b800-be538674c9dc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:52:14.609675 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:14.609642 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hxtnr\" (UniqueName: \"kubernetes.io/projected/226db398-0910-4352-b800-be538674c9dc-kube-api-access-hxtnr\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:52:14.609675 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:14.609670 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/226db398-0910-4352-b800-be538674c9dc-util\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:52:14.609675 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:14.609679 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/226db398-0910-4352-b800-be538674c9dc-bundle\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:52:15.276457 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:15.276422 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" event={"ID":"226db398-0910-4352-b800-be538674c9dc","Type":"ContainerDied","Data":"0e3b2bd720d66a30d93d9067b277387486ae718c95222d92a79ac926db621b5d"} Apr 17 20:52:15.276599 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:15.276464 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e3b2bd720d66a30d93d9067b277387486ae718c95222d92a79ac926db621b5d" Apr 17 20:52:15.276599 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:15.276433 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rvrq9" Apr 17 20:52:21.569274 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.569238 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"] Apr 17 20:52:21.569855 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.569817 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="226db398-0910-4352-b800-be538674c9dc" containerName="extract" Apr 17 20:52:21.569855 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.569838 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="226db398-0910-4352-b800-be538674c9dc" containerName="extract" Apr 17 20:52:21.569855 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.569851 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="226db398-0910-4352-b800-be538674c9dc" containerName="pull" Apr 17 20:52:21.569855 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.569859 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="226db398-0910-4352-b800-be538674c9dc" containerName="pull" Apr 17 20:52:21.570081 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.569887 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="226db398-0910-4352-b800-be538674c9dc" containerName="util" Apr 17 20:52:21.570081 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.569896 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="226db398-0910-4352-b800-be538674c9dc" containerName="util" Apr 17 20:52:21.570081 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.569979 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="226db398-0910-4352-b800-be538674c9dc" containerName="extract" Apr 17 20:52:21.573088 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.573069 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4" Apr 17 20:52:21.575017 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.574993 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-9ngs8\"" Apr 17 20:52:21.575112 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.575083 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 20:52:21.583773 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.583749 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"] Apr 17 20:52:21.670197 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.670170 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4" Apr 17 20:52:21.670367 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.670202 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.670367 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.670224 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.670367 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.670272 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.670367 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.670294 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.670367 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.670341 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.670616 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.670416 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.670616 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.670446 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.670616 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.670493 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xfxq\" (UniqueName: \"kubernetes.io/projected/53fbeaf4-e0cf-4a00-8d61-654887368d4a-kube-api-access-5xfxq\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.771022 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.770991 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.771022 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.771024 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.771299 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.771053 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xfxq\" (UniqueName: \"kubernetes.io/projected/53fbeaf4-e0cf-4a00-8d61-654887368d4a-kube-api-access-5xfxq\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.771299 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.771092 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.771299 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.771110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.771299 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.771131 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.771299 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.771162 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.771299 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.771199 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.771299 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.771226 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.771701 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.771648 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.771836 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.771787 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.771918 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.771887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.771974 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.771958 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.772234 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.772207 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.773786 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.773761 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.773928 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.773874 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.778374 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.778348 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xfxq\" (UniqueName: \"kubernetes.io/projected/53fbeaf4-e0cf-4a00-8d61-654887368d4a-kube-api-access-5xfxq\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.778655 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.778633 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/53fbeaf4-e0cf-4a00-8d61-654887368d4a-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4\" (UID: \"53fbeaf4-e0cf-4a00-8d61-654887368d4a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:21.886050 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:21.885973 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:22.013005 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:22.012978 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"]
Apr 17 20:52:22.014715 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:52:22.014677 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53fbeaf4_e0cf_4a00_8d61_654887368d4a.slice/crio-2b529cdf3e2248f4e13b445950e2edc518cc6127b1cb8a129359406b02365529 WatchSource:0}: Error finding container 2b529cdf3e2248f4e13b445950e2edc518cc6127b1cb8a129359406b02365529: Status 404 returned error can't find the container with id 2b529cdf3e2248f4e13b445950e2edc518cc6127b1cb8a129359406b02365529
Apr 17 20:52:22.305827 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:22.305776 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4" event={"ID":"53fbeaf4-e0cf-4a00-8d61-654887368d4a","Type":"ContainerStarted","Data":"2b529cdf3e2248f4e13b445950e2edc518cc6127b1cb8a129359406b02365529"}
Apr 17 20:52:24.500467 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:24.500430 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 17 20:52:24.500698 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:24.500508 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 17 20:52:24.500698 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:24.500541 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 17 20:52:25.317821 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:25.317768 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4" event={"ID":"53fbeaf4-e0cf-4a00-8d61-654887368d4a","Type":"ContainerStarted","Data":"93b91b12b79f898de05dd89d104cd7f9a567fd4b6c214bde6f7f0413ce681233"}
Apr 17 20:52:25.336503 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:25.336427 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4" podStartSLOduration=1.852903749 podStartE2EDuration="4.336413112s" podCreationTimestamp="2026-04-17 20:52:21 +0000 UTC" firstStartedPulling="2026-04-17 20:52:22.016711843 +0000 UTC m=+493.741979182" lastFinishedPulling="2026-04-17 20:52:24.500221195 +0000 UTC m=+496.225488545" observedRunningTime="2026-04-17 20:52:25.334681698 +0000 UTC m=+497.059949085" watchObservedRunningTime="2026-04-17 20:52:25.336413112 +0000 UTC m=+497.061680469"
Apr 17 20:52:25.886579 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:25.886547 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:25.890977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:25.890947 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:26.321733 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:26.321709 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:26.322686 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:26.322669 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4"
Apr 17 20:52:53.362380 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.362309 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lrx5x"]
Apr 17 20:52:53.365550 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.365530 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-lrx5x"
Apr 17 20:52:53.367594 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.367570 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-kd9nx\""
Apr 17 20:52:53.367692 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.367611 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 20:52:53.368084 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.368065 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 20:52:53.376753 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.376731 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lrx5x"]
Apr 17 20:52:53.434965 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.434935 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2gzw\" (UniqueName: \"kubernetes.io/projected/804f0cf6-32cc-4390-9bee-f75fd8b880d9-kube-api-access-m2gzw\") pod \"kuadrant-operator-catalog-lrx5x\" (UID: \"804f0cf6-32cc-4390-9bee-f75fd8b880d9\") " pod="kuadrant-system/kuadrant-operator-catalog-lrx5x"
Apr 17 20:52:53.536026 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.535989 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2gzw\" (UniqueName: \"kubernetes.io/projected/804f0cf6-32cc-4390-9bee-f75fd8b880d9-kube-api-access-m2gzw\") pod \"kuadrant-operator-catalog-lrx5x\" (UID: \"804f0cf6-32cc-4390-9bee-f75fd8b880d9\") " pod="kuadrant-system/kuadrant-operator-catalog-lrx5x"
Apr 17 20:52:53.543174 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.543150 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2gzw\" (UniqueName: \"kubernetes.io/projected/804f0cf6-32cc-4390-9bee-f75fd8b880d9-kube-api-access-m2gzw\") pod \"kuadrant-operator-catalog-lrx5x\" (UID: \"804f0cf6-32cc-4390-9bee-f75fd8b880d9\") " pod="kuadrant-system/kuadrant-operator-catalog-lrx5x"
Apr 17 20:52:53.675489 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.675402 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-lrx5x"
Apr 17 20:52:53.741400 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.741364 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lrx5x"]
Apr 17 20:52:53.794271 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.794246 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lrx5x"]
Apr 17 20:52:53.795982 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:52:53.795956 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804f0cf6_32cc_4390_9bee_f75fd8b880d9.slice/crio-dd4cae2ad4228027ff6ddedcdcd877f6fa0becf113c31eddc29ac567069510bb WatchSource:0}: Error finding container dd4cae2ad4228027ff6ddedcdcd877f6fa0becf113c31eddc29ac567069510bb: Status 404 returned error can't find the container with id dd4cae2ad4228027ff6ddedcdcd877f6fa0becf113c31eddc29ac567069510bb
Apr 17 20:52:53.946579 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.946504 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8wrsk"]
Apr 17 20:52:53.951260 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.951241 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-8wrsk"
Apr 17 20:52:53.955160 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:53.955140 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8wrsk"]
Apr 17 20:52:54.040277 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:54.040246 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5rvj\" (UniqueName: \"kubernetes.io/projected/c5b1e4bb-538f-4cfa-9d66-332b5e79efca-kube-api-access-g5rvj\") pod \"kuadrant-operator-catalog-8wrsk\" (UID: \"c5b1e4bb-538f-4cfa-9d66-332b5e79efca\") " pod="kuadrant-system/kuadrant-operator-catalog-8wrsk"
Apr 17 20:52:54.141284 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:54.141249 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5rvj\" (UniqueName: \"kubernetes.io/projected/c5b1e4bb-538f-4cfa-9d66-332b5e79efca-kube-api-access-g5rvj\") pod \"kuadrant-operator-catalog-8wrsk\" (UID: \"c5b1e4bb-538f-4cfa-9d66-332b5e79efca\") " pod="kuadrant-system/kuadrant-operator-catalog-8wrsk"
Apr 17 20:52:54.148256 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:54.148229 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5rvj\" (UniqueName: \"kubernetes.io/projected/c5b1e4bb-538f-4cfa-9d66-332b5e79efca-kube-api-access-g5rvj\") pod \"kuadrant-operator-catalog-8wrsk\" (UID: \"c5b1e4bb-538f-4cfa-9d66-332b5e79efca\") " pod="kuadrant-system/kuadrant-operator-catalog-8wrsk"
Apr 17 20:52:54.262139 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:54.262065 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-8wrsk"
Apr 17 20:52:54.384411 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:54.384355 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8wrsk"]
Apr 17 20:52:54.409826 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:52:54.409774 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5b1e4bb_538f_4cfa_9d66_332b5e79efca.slice/crio-a3335b8b8b08e4ce9bf648d67452cb77c337ab95774065e43eb1b072239d4c0f WatchSource:0}: Error finding container a3335b8b8b08e4ce9bf648d67452cb77c337ab95774065e43eb1b072239d4c0f: Status 404 returned error can't find the container with id a3335b8b8b08e4ce9bf648d67452cb77c337ab95774065e43eb1b072239d4c0f
Apr 17 20:52:54.424548 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:54.424520 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-lrx5x" event={"ID":"804f0cf6-32cc-4390-9bee-f75fd8b880d9","Type":"ContainerStarted","Data":"dd4cae2ad4228027ff6ddedcdcd877f6fa0becf113c31eddc29ac567069510bb"}
Apr 17 20:52:54.425661 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:54.425638 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-8wrsk" event={"ID":"c5b1e4bb-538f-4cfa-9d66-332b5e79efca","Type":"ContainerStarted","Data":"a3335b8b8b08e4ce9bf648d67452cb77c337ab95774065e43eb1b072239d4c0f"}
Apr 17 20:52:56.434866 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:56.434827 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-8wrsk" event={"ID":"c5b1e4bb-538f-4cfa-9d66-332b5e79efca","Type":"ContainerStarted","Data":"0cc1f5560074480c51775addf31a280f7c8fafbce1e95528c8e5ff0a7d0f2c5f"}
Apr 17 20:52:56.436147 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:56.436122 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-lrx5x" event={"ID":"804f0cf6-32cc-4390-9bee-f75fd8b880d9","Type":"ContainerStarted","Data":"d057cc410e6502675d34091ae09d7b44ab20329887f1057cfcf41b045bd71488"}
Apr 17 20:52:56.436277 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:56.436148 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-lrx5x" podUID="804f0cf6-32cc-4390-9bee-f75fd8b880d9" containerName="registry-server" containerID="cri-o://d057cc410e6502675d34091ae09d7b44ab20329887f1057cfcf41b045bd71488" gracePeriod=2
Apr 17 20:52:56.450747 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:56.450708 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-8wrsk" podStartSLOduration=1.980359994 podStartE2EDuration="3.45069448s" podCreationTimestamp="2026-04-17 20:52:53 +0000 UTC" firstStartedPulling="2026-04-17 20:52:54.411131957 +0000 UTC m=+526.136399293" lastFinishedPulling="2026-04-17 20:52:55.881466433 +0000 UTC m=+527.606733779" observedRunningTime="2026-04-17 20:52:56.447544048 +0000 UTC m=+528.172811404" watchObservedRunningTime="2026-04-17 20:52:56.45069448 +0000 UTC m=+528.175961838"
Apr 17 20:52:56.460617 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:56.460578 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-lrx5x" podStartSLOduration=1.3789773 podStartE2EDuration="3.460565154s" podCreationTimestamp="2026-04-17 20:52:53 +0000 UTC" firstStartedPulling="2026-04-17 20:52:53.797182236 +0000 UTC m=+525.522449571" lastFinishedPulling="2026-04-17 20:52:55.87877009 +0000 UTC m=+527.604037425" observedRunningTime="2026-04-17 20:52:56.459036894 +0000 UTC m=+528.184304251" watchObservedRunningTime="2026-04-17 20:52:56.460565154 +0000 UTC m=+528.185832528"
Apr 17 20:52:56.675532 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:56.675509 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-lrx5x"
Apr 17 20:52:56.765211 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:56.765125 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2gzw\" (UniqueName: \"kubernetes.io/projected/804f0cf6-32cc-4390-9bee-f75fd8b880d9-kube-api-access-m2gzw\") pod \"804f0cf6-32cc-4390-9bee-f75fd8b880d9\" (UID: \"804f0cf6-32cc-4390-9bee-f75fd8b880d9\") "
Apr 17 20:52:56.767270 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:56.767247 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804f0cf6-32cc-4390-9bee-f75fd8b880d9-kube-api-access-m2gzw" (OuterVolumeSpecName: "kube-api-access-m2gzw") pod "804f0cf6-32cc-4390-9bee-f75fd8b880d9" (UID: "804f0cf6-32cc-4390-9bee-f75fd8b880d9"). InnerVolumeSpecName "kube-api-access-m2gzw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:52:56.866593 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:56.866557 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m2gzw\" (UniqueName: \"kubernetes.io/projected/804f0cf6-32cc-4390-9bee-f75fd8b880d9-kube-api-access-m2gzw\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:52:57.440713 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:57.440679 2572 generic.go:358] "Generic (PLEG): container finished" podID="804f0cf6-32cc-4390-9bee-f75fd8b880d9" containerID="d057cc410e6502675d34091ae09d7b44ab20329887f1057cfcf41b045bd71488" exitCode=0
Apr 17 20:52:57.441139 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:57.440754 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-lrx5x"
Apr 17 20:52:57.441139 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:57.440766 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-lrx5x" event={"ID":"804f0cf6-32cc-4390-9bee-f75fd8b880d9","Type":"ContainerDied","Data":"d057cc410e6502675d34091ae09d7b44ab20329887f1057cfcf41b045bd71488"}
Apr 17 20:52:57.441139 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:57.440819 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-lrx5x" event={"ID":"804f0cf6-32cc-4390-9bee-f75fd8b880d9","Type":"ContainerDied","Data":"dd4cae2ad4228027ff6ddedcdcd877f6fa0becf113c31eddc29ac567069510bb"}
Apr 17 20:52:57.441139 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:57.440832 2572 scope.go:117] "RemoveContainer" containerID="d057cc410e6502675d34091ae09d7b44ab20329887f1057cfcf41b045bd71488"
Apr 17 20:52:57.449835 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:57.449795 2572 scope.go:117] "RemoveContainer" containerID="d057cc410e6502675d34091ae09d7b44ab20329887f1057cfcf41b045bd71488"
Apr 17 20:52:57.450084 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:52:57.450067 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d057cc410e6502675d34091ae09d7b44ab20329887f1057cfcf41b045bd71488\": container with ID starting with d057cc410e6502675d34091ae09d7b44ab20329887f1057cfcf41b045bd71488 not found: ID does not exist" containerID="d057cc410e6502675d34091ae09d7b44ab20329887f1057cfcf41b045bd71488"
Apr 17 20:52:57.450145 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:57.450094 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d057cc410e6502675d34091ae09d7b44ab20329887f1057cfcf41b045bd71488"} err="failed to get container status \"d057cc410e6502675d34091ae09d7b44ab20329887f1057cfcf41b045bd71488\": rpc error: code = NotFound desc = could not find container \"d057cc410e6502675d34091ae09d7b44ab20329887f1057cfcf41b045bd71488\": container with ID starting with d057cc410e6502675d34091ae09d7b44ab20329887f1057cfcf41b045bd71488 not found: ID does not exist"
Apr 17 20:52:57.459249 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:57.459228 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lrx5x"]
Apr 17 20:52:57.464578 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:57.464556 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lrx5x"]
Apr 17 20:52:58.714520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:52:58.714486 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804f0cf6-32cc-4390-9bee-f75fd8b880d9" path="/var/lib/kubelet/pods/804f0cf6-32cc-4390-9bee-f75fd8b880d9/volumes"
Apr 17 20:53:04.262447 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:04.262400 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-8wrsk"
Apr 17 20:53:04.262447 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:04.262452 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-8wrsk"
Apr 17 20:53:04.284016 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:04.283991 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-8wrsk"
Apr 17 20:53:04.489203 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:04.489177 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-8wrsk"
Apr 17 20:53:08.576322 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.576278 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"]
Apr 17 20:53:08.576979 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.576960 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="804f0cf6-32cc-4390-9bee-f75fd8b880d9" containerName="registry-server"
Apr 17 20:53:08.577039 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.576982 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="804f0cf6-32cc-4390-9bee-f75fd8b880d9" containerName="registry-server"
Apr 17 20:53:08.577089 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.577077 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="804f0cf6-32cc-4390-9bee-f75fd8b880d9" containerName="registry-server"
Apr 17 20:53:08.585672 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.585649 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"]
Apr 17 20:53:08.585818 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.585770 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"
Apr 17 20:53:08.588032 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.588006 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-d72jt\""
Apr 17 20:53:08.768541 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.768512 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w\" (UID: \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"
Apr 17 20:53:08.768712 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.768569 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbgx9\" (UniqueName: \"kubernetes.io/projected/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-kube-api-access-xbgx9\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w\" (UID: \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"
Apr 17 20:53:08.768712 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.768691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w\" (UID: \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"
Apr 17 20:53:08.870114 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.870035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w\" (UID: \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"
Apr 17 20:53:08.870114 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.870104 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbgx9\" (UniqueName: \"kubernetes.io/projected/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-kube-api-access-xbgx9\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w\" (UID: \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"
Apr 17 20:53:08.870326 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.870160 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w\" (UID: \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"
Apr 17 20:53:08.870514 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.870494 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w\" (UID: \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"
Apr 17 20:53:08.870570 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.870512 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w\" (UID: \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"
Apr 17 20:53:08.877323 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.877300 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbgx9\" (UniqueName: \"kubernetes.io/projected/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-kube-api-access-xbgx9\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w\" (UID: \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"
Apr 17 20:53:08.897929 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.897905 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-d72jt\""
Apr 17 20:53:08.907333 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.907313 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"
Apr 17 20:53:08.975024 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.974994 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2"]
Apr 17 20:53:08.980149 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.980128 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2"
Apr 17 20:53:08.983604 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:08.983576 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2"]
Apr 17 20:53:09.034306 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.034249 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"]
Apr 17 20:53:09.036434 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:53:09.036398 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67a45dd8_2e11_422d_bc6c_a98b6de68e1f.slice/crio-36873967ed9ba9d3bb0d6a1f0c2f6345f632a9969796d9b35ac66a16a4762008 WatchSource:0}: Error finding container 36873967ed9ba9d3bb0d6a1f0c2f6345f632a9969796d9b35ac66a16a4762008: Status 404 returned error can't find the container with id 36873967ed9ba9d3bb0d6a1f0c2f6345f632a9969796d9b35ac66a16a4762008
Apr 17 20:53:09.071785 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.071753 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5600ae44-a4ac-4606-973d-5f0aa6ba421a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2\" (UID: \"5600ae44-a4ac-4606-973d-5f0aa6ba421a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2"
Apr 17 20:53:09.071907 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.071788 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-597kx\" (UniqueName: \"kubernetes.io/projected/5600ae44-a4ac-4606-973d-5f0aa6ba421a-kube-api-access-597kx\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2\" (UID: 
\"5600ae44-a4ac-4606-973d-5f0aa6ba421a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" Apr 17 20:53:09.071990 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.071968 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5600ae44-a4ac-4606-973d-5f0aa6ba421a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2\" (UID: \"5600ae44-a4ac-4606-973d-5f0aa6ba421a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" Apr 17 20:53:09.172630 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.172598 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5600ae44-a4ac-4606-973d-5f0aa6ba421a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2\" (UID: \"5600ae44-a4ac-4606-973d-5f0aa6ba421a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" Apr 17 20:53:09.172764 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.172657 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5600ae44-a4ac-4606-973d-5f0aa6ba421a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2\" (UID: \"5600ae44-a4ac-4606-973d-5f0aa6ba421a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" Apr 17 20:53:09.172764 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.172685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-597kx\" (UniqueName: \"kubernetes.io/projected/5600ae44-a4ac-4606-973d-5f0aa6ba421a-kube-api-access-597kx\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2\" (UID: \"5600ae44-a4ac-4606-973d-5f0aa6ba421a\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" Apr 17 20:53:09.173004 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.172972 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5600ae44-a4ac-4606-973d-5f0aa6ba421a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2\" (UID: \"5600ae44-a4ac-4606-973d-5f0aa6ba421a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" Apr 17 20:53:09.173045 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.172985 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5600ae44-a4ac-4606-973d-5f0aa6ba421a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2\" (UID: \"5600ae44-a4ac-4606-973d-5f0aa6ba421a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" Apr 17 20:53:09.179755 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.179734 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-597kx\" (UniqueName: \"kubernetes.io/projected/5600ae44-a4ac-4606-973d-5f0aa6ba421a-kube-api-access-597kx\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2\" (UID: \"5600ae44-a4ac-4606-973d-5f0aa6ba421a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" Apr 17 20:53:09.294035 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.294002 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" Apr 17 20:53:09.415721 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.415696 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2"] Apr 17 20:53:09.417560 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:53:09.417530 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5600ae44_a4ac_4606_973d_5f0aa6ba421a.slice/crio-2796af51b52158cefc41b29238b2554040d660f54fc12a2e310e301a377e3a21 WatchSource:0}: Error finding container 2796af51b52158cefc41b29238b2554040d660f54fc12a2e310e301a377e3a21: Status 404 returned error can't find the container with id 2796af51b52158cefc41b29238b2554040d660f54fc12a2e310e301a377e3a21 Apr 17 20:53:09.486374 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.486351 2572 generic.go:358] "Generic (PLEG): container finished" podID="67a45dd8-2e11-422d-bc6c-a98b6de68e1f" containerID="e53ccedb224b91d371594abd18ac3a4ab3f12dc18260706061caf6b4276a32eb" exitCode=0 Apr 17 20:53:09.486486 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.486430 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w" event={"ID":"67a45dd8-2e11-422d-bc6c-a98b6de68e1f","Type":"ContainerDied","Data":"e53ccedb224b91d371594abd18ac3a4ab3f12dc18260706061caf6b4276a32eb"} Apr 17 20:53:09.486486 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.486473 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w" event={"ID":"67a45dd8-2e11-422d-bc6c-a98b6de68e1f","Type":"ContainerStarted","Data":"36873967ed9ba9d3bb0d6a1f0c2f6345f632a9969796d9b35ac66a16a4762008"} Apr 17 20:53:09.487760 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.487736 
2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" event={"ID":"5600ae44-a4ac-4606-973d-5f0aa6ba421a","Type":"ContainerStarted","Data":"be31e7a005743a1e1ccc63833abd9c1ffbc4ae3d5e0a4bdc2ce4f81104e96ba8"} Apr 17 20:53:09.487905 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.487766 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" event={"ID":"5600ae44-a4ac-4606-973d-5f0aa6ba421a","Type":"ContainerStarted","Data":"2796af51b52158cefc41b29238b2554040d660f54fc12a2e310e301a377e3a21"} Apr 17 20:53:09.577005 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.576965 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx"] Apr 17 20:53:09.580764 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.580744 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" Apr 17 20:53:09.588445 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.588424 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx"] Apr 17 20:53:09.677939 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.677851 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c17421b5-364f-4970-a4ae-a9a50d16e23d-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx\" (UID: \"c17421b5-364f-4970-a4ae-a9a50d16e23d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" Apr 17 20:53:09.678061 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.677972 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c17421b5-364f-4970-a4ae-a9a50d16e23d-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx\" (UID: \"c17421b5-364f-4970-a4ae-a9a50d16e23d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" Apr 17 20:53:09.678061 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.678009 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq8tf\" (UniqueName: \"kubernetes.io/projected/c17421b5-364f-4970-a4ae-a9a50d16e23d-kube-api-access-bq8tf\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx\" (UID: \"c17421b5-364f-4970-a4ae-a9a50d16e23d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" Apr 17 20:53:09.778557 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.778525 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/c17421b5-364f-4970-a4ae-a9a50d16e23d-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx\" (UID: \"c17421b5-364f-4970-a4ae-a9a50d16e23d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" Apr 17 20:53:09.778723 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.778564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bq8tf\" (UniqueName: \"kubernetes.io/projected/c17421b5-364f-4970-a4ae-a9a50d16e23d-kube-api-access-bq8tf\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx\" (UID: \"c17421b5-364f-4970-a4ae-a9a50d16e23d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" Apr 17 20:53:09.778723 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.778601 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c17421b5-364f-4970-a4ae-a9a50d16e23d-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx\" (UID: \"c17421b5-364f-4970-a4ae-a9a50d16e23d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" Apr 17 20:53:09.778909 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.778886 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c17421b5-364f-4970-a4ae-a9a50d16e23d-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx\" (UID: \"c17421b5-364f-4970-a4ae-a9a50d16e23d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" Apr 17 20:53:09.778960 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.778907 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c17421b5-364f-4970-a4ae-a9a50d16e23d-util\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx\" (UID: \"c17421b5-364f-4970-a4ae-a9a50d16e23d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" Apr 17 20:53:09.785184 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.785161 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq8tf\" (UniqueName: \"kubernetes.io/projected/c17421b5-364f-4970-a4ae-a9a50d16e23d-kube-api-access-bq8tf\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx\" (UID: \"c17421b5-364f-4970-a4ae-a9a50d16e23d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" Apr 17 20:53:09.898664 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:09.898633 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" Apr 17 20:53:10.019528 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.019502 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx"] Apr 17 20:53:10.021871 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:53:10.021837 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc17421b5_364f_4970_a4ae_a9a50d16e23d.slice/crio-c2d901cadc87ab005375bc0f272f8aaa748c74ede77ba816c7c33157ff5c9196 WatchSource:0}: Error finding container c2d901cadc87ab005375bc0f272f8aaa748c74ede77ba816c7c33157ff5c9196: Status 404 returned error can't find the container with id c2d901cadc87ab005375bc0f272f8aaa748c74ede77ba816c7c33157ff5c9196 Apr 17 20:53:10.181021 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.180987 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf"] Apr 17 20:53:10.184947 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:53:10.184924 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" Apr 17 20:53:10.190296 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.190274 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf"] Apr 17 20:53:10.283724 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.283677 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0cf107ca-d674-4bbf-a284-1082b47310b0-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf\" (UID: \"0cf107ca-d674-4bbf-a284-1082b47310b0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" Apr 17 20:53:10.283724 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.283705 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0cf107ca-d674-4bbf-a284-1082b47310b0-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf\" (UID: \"0cf107ca-d674-4bbf-a284-1082b47310b0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" Apr 17 20:53:10.283882 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.283856 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n55jq\" (UniqueName: \"kubernetes.io/projected/0cf107ca-d674-4bbf-a284-1082b47310b0-kube-api-access-n55jq\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf\" (UID: \"0cf107ca-d674-4bbf-a284-1082b47310b0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" Apr 17 20:53:10.384497 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.384464 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n55jq\" (UniqueName: \"kubernetes.io/projected/0cf107ca-d674-4bbf-a284-1082b47310b0-kube-api-access-n55jq\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf\" (UID: \"0cf107ca-d674-4bbf-a284-1082b47310b0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" Apr 17 20:53:10.384610 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.384557 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0cf107ca-d674-4bbf-a284-1082b47310b0-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf\" (UID: \"0cf107ca-d674-4bbf-a284-1082b47310b0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" Apr 17 20:53:10.384610 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.384584 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0cf107ca-d674-4bbf-a284-1082b47310b0-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf\" (UID: \"0cf107ca-d674-4bbf-a284-1082b47310b0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" Apr 17 20:53:10.384937 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.384918 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0cf107ca-d674-4bbf-a284-1082b47310b0-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf\" (UID: \"0cf107ca-d674-4bbf-a284-1082b47310b0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" Apr 17 20:53:10.384980 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.384960 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0cf107ca-d674-4bbf-a284-1082b47310b0-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf\" (UID: \"0cf107ca-d674-4bbf-a284-1082b47310b0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" Apr 17 20:53:10.391427 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.391405 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n55jq\" (UniqueName: \"kubernetes.io/projected/0cf107ca-d674-4bbf-a284-1082b47310b0-kube-api-access-n55jq\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf\" (UID: \"0cf107ca-d674-4bbf-a284-1082b47310b0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" Apr 17 20:53:10.493970 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.493937 2572 generic.go:358] "Generic (PLEG): container finished" podID="67a45dd8-2e11-422d-bc6c-a98b6de68e1f" containerID="54d4be04b5013dae7666dc7b7e4b49673050513f05255d8ad95d5b0084441bdc" exitCode=0 Apr 17 20:53:10.494142 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.494001 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w" event={"ID":"67a45dd8-2e11-422d-bc6c-a98b6de68e1f","Type":"ContainerDied","Data":"54d4be04b5013dae7666dc7b7e4b49673050513f05255d8ad95d5b0084441bdc"} Apr 17 20:53:10.495429 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.495407 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" Apr 17 20:53:10.495429 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.495414 2572 generic.go:358] "Generic (PLEG): container finished" podID="c17421b5-364f-4970-a4ae-a9a50d16e23d" containerID="7bfa477f690a4ad98a028465580e294a472973617bdb927ac9044f428e0a7218" exitCode=0 Apr 17 20:53:10.495585 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.495513 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" event={"ID":"c17421b5-364f-4970-a4ae-a9a50d16e23d","Type":"ContainerDied","Data":"7bfa477f690a4ad98a028465580e294a472973617bdb927ac9044f428e0a7218"} Apr 17 20:53:10.495585 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.495544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" event={"ID":"c17421b5-364f-4970-a4ae-a9a50d16e23d","Type":"ContainerStarted","Data":"c2d901cadc87ab005375bc0f272f8aaa748c74ede77ba816c7c33157ff5c9196"} Apr 17 20:53:10.497248 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.497223 2572 generic.go:358] "Generic (PLEG): container finished" podID="5600ae44-a4ac-4606-973d-5f0aa6ba421a" containerID="be31e7a005743a1e1ccc63833abd9c1ffbc4ae3d5e0a4bdc2ce4f81104e96ba8" exitCode=0 Apr 17 20:53:10.497379 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.497297 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" event={"ID":"5600ae44-a4ac-4606-973d-5f0aa6ba421a","Type":"ContainerDied","Data":"be31e7a005743a1e1ccc63833abd9c1ffbc4ae3d5e0a4bdc2ce4f81104e96ba8"} Apr 17 20:53:10.627769 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:10.627746 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf"] Apr 17 20:53:10.629567 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:53:10.629540 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cf107ca_d674_4bbf_a284_1082b47310b0.slice/crio-e50b667a2dddeaa4fd80d64a9ab2e6b3f8268850e645536bd20f2d232053a7e9 WatchSource:0}: Error finding container e50b667a2dddeaa4fd80d64a9ab2e6b3f8268850e645536bd20f2d232053a7e9: Status 404 returned error can't find the container with id e50b667a2dddeaa4fd80d64a9ab2e6b3f8268850e645536bd20f2d232053a7e9 Apr 17 20:53:11.502415 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:11.502327 2572 generic.go:358] "Generic (PLEG): container finished" podID="67a45dd8-2e11-422d-bc6c-a98b6de68e1f" containerID="db80662454e69df38828b1cd2e19f49b2af7105ce9d5c2c571c4f6efe739f8b6" exitCode=0 Apr 17 20:53:11.502415 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:11.502399 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w" event={"ID":"67a45dd8-2e11-422d-bc6c-a98b6de68e1f","Type":"ContainerDied","Data":"db80662454e69df38828b1cd2e19f49b2af7105ce9d5c2c571c4f6efe739f8b6"} Apr 17 20:53:11.503553 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:11.503533 2572 generic.go:358] "Generic (PLEG): container finished" podID="0cf107ca-d674-4bbf-a284-1082b47310b0" containerID="448b05c36d0cbc152dd15969f81512db02af84f4637a222b3803734247cc9c40" exitCode=0 Apr 17 20:53:11.503681 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:11.503616 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" event={"ID":"0cf107ca-d674-4bbf-a284-1082b47310b0","Type":"ContainerDied","Data":"448b05c36d0cbc152dd15969f81512db02af84f4637a222b3803734247cc9c40"} Apr 17 20:53:11.503681 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:53:11.503632 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" event={"ID":"0cf107ca-d674-4bbf-a284-1082b47310b0","Type":"ContainerStarted","Data":"e50b667a2dddeaa4fd80d64a9ab2e6b3f8268850e645536bd20f2d232053a7e9"} Apr 17 20:53:11.505237 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:11.505218 2572 generic.go:358] "Generic (PLEG): container finished" podID="5600ae44-a4ac-4606-973d-5f0aa6ba421a" containerID="48ab209f677eb39daa4581a7f638566030fa77e2c88a26c63f49fca77255b430" exitCode=0 Apr 17 20:53:11.505327 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:11.505281 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" event={"ID":"5600ae44-a4ac-4606-973d-5f0aa6ba421a","Type":"ContainerDied","Data":"48ab209f677eb39daa4581a7f638566030fa77e2c88a26c63f49fca77255b430"} Apr 17 20:53:12.510938 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.510848 2572 generic.go:358] "Generic (PLEG): container finished" podID="c17421b5-364f-4970-a4ae-a9a50d16e23d" containerID="72b622dec471d074fe684f56152b7d3e3a47a654d3eabfa7206852071bacd5e1" exitCode=0 Apr 17 20:53:12.511384 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.510928 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" event={"ID":"c17421b5-364f-4970-a4ae-a9a50d16e23d","Type":"ContainerDied","Data":"72b622dec471d074fe684f56152b7d3e3a47a654d3eabfa7206852071bacd5e1"} Apr 17 20:53:12.512668 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.512556 2572 generic.go:358] "Generic (PLEG): container finished" podID="0cf107ca-d674-4bbf-a284-1082b47310b0" containerID="750f3abd535a3783e617ea317e45490c52983c52f51f10d65a616ad123cf4c76" exitCode=0 Apr 17 20:53:12.512749 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.512665 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" event={"ID":"0cf107ca-d674-4bbf-a284-1082b47310b0","Type":"ContainerDied","Data":"750f3abd535a3783e617ea317e45490c52983c52f51f10d65a616ad123cf4c76"} Apr 17 20:53:12.514716 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.514694 2572 generic.go:358] "Generic (PLEG): container finished" podID="5600ae44-a4ac-4606-973d-5f0aa6ba421a" containerID="361a536d7b001fcb3100e84d1afff5b5d29225ccba42badd9a51f2bda983d4f9" exitCode=0 Apr 17 20:53:12.514831 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.514765 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" event={"ID":"5600ae44-a4ac-4606-973d-5f0aa6ba421a","Type":"ContainerDied","Data":"361a536d7b001fcb3100e84d1afff5b5d29225ccba42badd9a51f2bda983d4f9"} Apr 17 20:53:12.642101 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.642075 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"
Apr 17 20:53:12.807003 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.806976 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-util\") pod \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\" (UID: \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\") "
Apr 17 20:53:12.807179 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.807031 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-bundle\") pod \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\" (UID: \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\") "
Apr 17 20:53:12.807179 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.807089 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbgx9\" (UniqueName: \"kubernetes.io/projected/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-kube-api-access-xbgx9\") pod \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\" (UID: \"67a45dd8-2e11-422d-bc6c-a98b6de68e1f\") "
Apr 17 20:53:12.807482 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.807449 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-bundle" (OuterVolumeSpecName: "bundle") pod "67a45dd8-2e11-422d-bc6c-a98b6de68e1f" (UID: "67a45dd8-2e11-422d-bc6c-a98b6de68e1f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:53:12.809212 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.809185 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-kube-api-access-xbgx9" (OuterVolumeSpecName: "kube-api-access-xbgx9") pod "67a45dd8-2e11-422d-bc6c-a98b6de68e1f" (UID: "67a45dd8-2e11-422d-bc6c-a98b6de68e1f"). InnerVolumeSpecName "kube-api-access-xbgx9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:53:12.812638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.812615 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-util" (OuterVolumeSpecName: "util") pod "67a45dd8-2e11-422d-bc6c-a98b6de68e1f" (UID: "67a45dd8-2e11-422d-bc6c-a98b6de68e1f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:53:12.908007 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.907974 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-util\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:53:12.908007 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.908004 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-bundle\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:53:12.908232 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:12.908020 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xbgx9\" (UniqueName: \"kubernetes.io/projected/67a45dd8-2e11-422d-bc6c-a98b6de68e1f-kube-api-access-xbgx9\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:53:13.520361 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.520324 2572 generic.go:358] "Generic (PLEG): container finished" podID="c17421b5-364f-4970-a4ae-a9a50d16e23d" containerID="e0c14fe29c0655eed57906d5988db5da717a587235df6d87bba7e148df327564" exitCode=0
Apr 17 20:53:13.520792 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.520399 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" event={"ID":"c17421b5-364f-4970-a4ae-a9a50d16e23d","Type":"ContainerDied","Data":"e0c14fe29c0655eed57906d5988db5da717a587235df6d87bba7e148df327564"}
Apr 17 20:53:13.522209 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.522183 2572 generic.go:358] "Generic (PLEG): container finished" podID="0cf107ca-d674-4bbf-a284-1082b47310b0" containerID="5add037918ccbc1a04a93baec6e457d32121ee3ecf908deb8fbfda2342f3882d" exitCode=0
Apr 17 20:53:13.522327 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.522288 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" event={"ID":"0cf107ca-d674-4bbf-a284-1082b47310b0","Type":"ContainerDied","Data":"5add037918ccbc1a04a93baec6e457d32121ee3ecf908deb8fbfda2342f3882d"}
Apr 17 20:53:13.524229 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.524210 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w" event={"ID":"67a45dd8-2e11-422d-bc6c-a98b6de68e1f","Type":"ContainerDied","Data":"36873967ed9ba9d3bb0d6a1f0c2f6345f632a9969796d9b35ac66a16a4762008"}
Apr 17 20:53:13.524320 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.524235 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36873967ed9ba9d3bb0d6a1f0c2f6345f632a9969796d9b35ac66a16a4762008"
Apr 17 20:53:13.524374 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.524336 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w"
Apr 17 20:53:13.649508 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.649487 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2"
Apr 17 20:53:13.815779 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.815745 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5600ae44-a4ac-4606-973d-5f0aa6ba421a-util\") pod \"5600ae44-a4ac-4606-973d-5f0aa6ba421a\" (UID: \"5600ae44-a4ac-4606-973d-5f0aa6ba421a\") "
Apr 17 20:53:13.815779 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.815789 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5600ae44-a4ac-4606-973d-5f0aa6ba421a-bundle\") pod \"5600ae44-a4ac-4606-973d-5f0aa6ba421a\" (UID: \"5600ae44-a4ac-4606-973d-5f0aa6ba421a\") "
Apr 17 20:53:13.816018 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.815874 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-597kx\" (UniqueName: \"kubernetes.io/projected/5600ae44-a4ac-4606-973d-5f0aa6ba421a-kube-api-access-597kx\") pod \"5600ae44-a4ac-4606-973d-5f0aa6ba421a\" (UID: \"5600ae44-a4ac-4606-973d-5f0aa6ba421a\") "
Apr 17 20:53:13.816341 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.816313 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5600ae44-a4ac-4606-973d-5f0aa6ba421a-bundle" (OuterVolumeSpecName: "bundle") pod "5600ae44-a4ac-4606-973d-5f0aa6ba421a" (UID: "5600ae44-a4ac-4606-973d-5f0aa6ba421a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:53:13.818044 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.818025 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5600ae44-a4ac-4606-973d-5f0aa6ba421a-kube-api-access-597kx" (OuterVolumeSpecName: "kube-api-access-597kx") pod "5600ae44-a4ac-4606-973d-5f0aa6ba421a" (UID: "5600ae44-a4ac-4606-973d-5f0aa6ba421a"). InnerVolumeSpecName "kube-api-access-597kx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:53:13.821611 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.821578 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5600ae44-a4ac-4606-973d-5f0aa6ba421a-util" (OuterVolumeSpecName: "util") pod "5600ae44-a4ac-4606-973d-5f0aa6ba421a" (UID: "5600ae44-a4ac-4606-973d-5f0aa6ba421a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:53:13.916782 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.916746 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5600ae44-a4ac-4606-973d-5f0aa6ba421a-bundle\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:53:13.916782 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.916777 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-597kx\" (UniqueName: \"kubernetes.io/projected/5600ae44-a4ac-4606-973d-5f0aa6ba421a-kube-api-access-597kx\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:53:13.917007 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:13.916791 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5600ae44-a4ac-4606-973d-5f0aa6ba421a-util\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:53:14.529789 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.529757 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2"
Apr 17 20:53:14.529789 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.529773 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2" event={"ID":"5600ae44-a4ac-4606-973d-5f0aa6ba421a","Type":"ContainerDied","Data":"2796af51b52158cefc41b29238b2554040d660f54fc12a2e310e301a377e3a21"}
Apr 17 20:53:14.530300 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.529831 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2796af51b52158cefc41b29238b2554040d660f54fc12a2e310e301a377e3a21"
Apr 17 20:53:14.671797 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.671774 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx"
Apr 17 20:53:14.696729 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.696699 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf"
Apr 17 20:53:14.826048 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.826014 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c17421b5-364f-4970-a4ae-a9a50d16e23d-util\") pod \"c17421b5-364f-4970-a4ae-a9a50d16e23d\" (UID: \"c17421b5-364f-4970-a4ae-a9a50d16e23d\") "
Apr 17 20:53:14.826217 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.826092 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0cf107ca-d674-4bbf-a284-1082b47310b0-util\") pod \"0cf107ca-d674-4bbf-a284-1082b47310b0\" (UID: \"0cf107ca-d674-4bbf-a284-1082b47310b0\") "
Apr 17 20:53:14.826217 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.826147 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n55jq\" (UniqueName: \"kubernetes.io/projected/0cf107ca-d674-4bbf-a284-1082b47310b0-kube-api-access-n55jq\") pod \"0cf107ca-d674-4bbf-a284-1082b47310b0\" (UID: \"0cf107ca-d674-4bbf-a284-1082b47310b0\") "
Apr 17 20:53:14.826217 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.826173 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq8tf\" (UniqueName: \"kubernetes.io/projected/c17421b5-364f-4970-a4ae-a9a50d16e23d-kube-api-access-bq8tf\") pod \"c17421b5-364f-4970-a4ae-a9a50d16e23d\" (UID: \"c17421b5-364f-4970-a4ae-a9a50d16e23d\") "
Apr 17 20:53:14.826383 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.826225 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0cf107ca-d674-4bbf-a284-1082b47310b0-bundle\") pod \"0cf107ca-d674-4bbf-a284-1082b47310b0\" (UID: \"0cf107ca-d674-4bbf-a284-1082b47310b0\") "
Apr 17 20:53:14.826383 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.826341 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c17421b5-364f-4970-a4ae-a9a50d16e23d-bundle\") pod \"c17421b5-364f-4970-a4ae-a9a50d16e23d\" (UID: \"c17421b5-364f-4970-a4ae-a9a50d16e23d\") "
Apr 17 20:53:14.827100 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.827039 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c17421b5-364f-4970-a4ae-a9a50d16e23d-bundle" (OuterVolumeSpecName: "bundle") pod "c17421b5-364f-4970-a4ae-a9a50d16e23d" (UID: "c17421b5-364f-4970-a4ae-a9a50d16e23d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:53:14.827209 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.827116 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cf107ca-d674-4bbf-a284-1082b47310b0-bundle" (OuterVolumeSpecName: "bundle") pod "0cf107ca-d674-4bbf-a284-1082b47310b0" (UID: "0cf107ca-d674-4bbf-a284-1082b47310b0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:53:14.828752 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.828724 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf107ca-d674-4bbf-a284-1082b47310b0-kube-api-access-n55jq" (OuterVolumeSpecName: "kube-api-access-n55jq") pod "0cf107ca-d674-4bbf-a284-1082b47310b0" (UID: "0cf107ca-d674-4bbf-a284-1082b47310b0"). InnerVolumeSpecName "kube-api-access-n55jq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:53:14.828893 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.828818 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c17421b5-364f-4970-a4ae-a9a50d16e23d-kube-api-access-bq8tf" (OuterVolumeSpecName: "kube-api-access-bq8tf") pod "c17421b5-364f-4970-a4ae-a9a50d16e23d" (UID: "c17421b5-364f-4970-a4ae-a9a50d16e23d"). InnerVolumeSpecName "kube-api-access-bq8tf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:53:14.832043 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.832022 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c17421b5-364f-4970-a4ae-a9a50d16e23d-util" (OuterVolumeSpecName: "util") pod "c17421b5-364f-4970-a4ae-a9a50d16e23d" (UID: "c17421b5-364f-4970-a4ae-a9a50d16e23d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:53:14.832624 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.832594 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cf107ca-d674-4bbf-a284-1082b47310b0-util" (OuterVolumeSpecName: "util") pod "0cf107ca-d674-4bbf-a284-1082b47310b0" (UID: "0cf107ca-d674-4bbf-a284-1082b47310b0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:53:14.927731 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.927698 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0cf107ca-d674-4bbf-a284-1082b47310b0-util\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:53:14.927731 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.927726 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n55jq\" (UniqueName: \"kubernetes.io/projected/0cf107ca-d674-4bbf-a284-1082b47310b0-kube-api-access-n55jq\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:53:14.928056 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.927741 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bq8tf\" (UniqueName: \"kubernetes.io/projected/c17421b5-364f-4970-a4ae-a9a50d16e23d-kube-api-access-bq8tf\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:53:14.928056 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.927755 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0cf107ca-d674-4bbf-a284-1082b47310b0-bundle\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:53:14.928056 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.927766 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c17421b5-364f-4970-a4ae-a9a50d16e23d-bundle\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:53:14.928056 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:14.927779 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c17421b5-364f-4970-a4ae-a9a50d16e23d-util\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\""
Apr 17 20:53:15.542748 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:15.542707 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx" event={"ID":"c17421b5-364f-4970-a4ae-a9a50d16e23d","Type":"ContainerDied","Data":"c2d901cadc87ab005375bc0f272f8aaa748c74ede77ba816c7c33157ff5c9196"}
Apr 17 20:53:15.542748 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:15.542736 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx"
Apr 17 20:53:15.542748 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:15.542747 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2d901cadc87ab005375bc0f272f8aaa748c74ede77ba816c7c33157ff5c9196"
Apr 17 20:53:15.544492 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:15.544460 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf" event={"ID":"0cf107ca-d674-4bbf-a284-1082b47310b0","Type":"ContainerDied","Data":"e50b667a2dddeaa4fd80d64a9ab2e6b3f8268850e645536bd20f2d232053a7e9"}
Apr 17 20:53:15.544492 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:15.544493 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e50b667a2dddeaa4fd80d64a9ab2e6b3f8268850e645536bd20f2d232053a7e9"
Apr 17 20:53:15.544663 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:15.544473 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf"
Apr 17 20:53:25.260351 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260312 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-pxbzh"]
Apr 17 20:53:25.260758 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260686 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c17421b5-364f-4970-a4ae-a9a50d16e23d" containerName="extract"
Apr 17 20:53:25.260758 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260697 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17421b5-364f-4970-a4ae-a9a50d16e23d" containerName="extract"
Apr 17 20:53:25.260758 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260708 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c17421b5-364f-4970-a4ae-a9a50d16e23d" containerName="util"
Apr 17 20:53:25.260758 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260714 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17421b5-364f-4970-a4ae-a9a50d16e23d" containerName="util"
Apr 17 20:53:25.260758 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260725 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67a45dd8-2e11-422d-bc6c-a98b6de68e1f" containerName="util"
Apr 17 20:53:25.260758 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260730 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a45dd8-2e11-422d-bc6c-a98b6de68e1f" containerName="util"
Apr 17 20:53:25.260758 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260738 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5600ae44-a4ac-4606-973d-5f0aa6ba421a" containerName="util"
Apr 17 20:53:25.260758 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260742 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5600ae44-a4ac-4606-973d-5f0aa6ba421a" containerName="util"
Apr 17 20:53:25.260758 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260748 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cf107ca-d674-4bbf-a284-1082b47310b0" containerName="extract"
Apr 17 20:53:25.260758 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260754 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf107ca-d674-4bbf-a284-1082b47310b0" containerName="extract"
Apr 17 20:53:25.260758 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260759 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cf107ca-d674-4bbf-a284-1082b47310b0" containerName="pull"
Apr 17 20:53:25.260758 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260764 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf107ca-d674-4bbf-a284-1082b47310b0" containerName="pull"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260772 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67a45dd8-2e11-422d-bc6c-a98b6de68e1f" containerName="pull"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260777 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a45dd8-2e11-422d-bc6c-a98b6de68e1f" containerName="pull"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260782 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67a45dd8-2e11-422d-bc6c-a98b6de68e1f" containerName="extract"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260788 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a45dd8-2e11-422d-bc6c-a98b6de68e1f" containerName="extract"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260795 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5600ae44-a4ac-4606-973d-5f0aa6ba421a" containerName="pull"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260818 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5600ae44-a4ac-4606-973d-5f0aa6ba421a" containerName="pull"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260826 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cf107ca-d674-4bbf-a284-1082b47310b0" containerName="util"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260839 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf107ca-d674-4bbf-a284-1082b47310b0" containerName="util"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260845 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5600ae44-a4ac-4606-973d-5f0aa6ba421a" containerName="extract"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260850 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5600ae44-a4ac-4606-973d-5f0aa6ba421a" containerName="extract"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260856 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c17421b5-364f-4970-a4ae-a9a50d16e23d" containerName="pull"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260861 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17421b5-364f-4970-a4ae-a9a50d16e23d" containerName="pull"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260914 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5600ae44-a4ac-4606-973d-5f0aa6ba421a" containerName="extract"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260922 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="67a45dd8-2e11-422d-bc6c-a98b6de68e1f" containerName="extract"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260928 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0cf107ca-d674-4bbf-a284-1082b47310b0" containerName="extract"
Apr 17 20:53:25.261135 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.260934 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c17421b5-364f-4970-a4ae-a9a50d16e23d" containerName="extract"
Apr 17 20:53:25.264321 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.264303 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-pxbzh"
Apr 17 20:53:25.266265 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.266244 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-kcv6z\""
Apr 17 20:53:25.273068 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.273044 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-pxbzh"]
Apr 17 20:53:25.311189 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.311164 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtzsj\" (UniqueName: \"kubernetes.io/projected/7a61a289-8974-49e8-84e5-00f258a378a4-kube-api-access-wtzsj\") pod \"authorino-operator-657f44b778-pxbzh\" (UID: \"7a61a289-8974-49e8-84e5-00f258a378a4\") " pod="kuadrant-system/authorino-operator-657f44b778-pxbzh"
Apr 17 20:53:25.412319 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.412289 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtzsj\" (UniqueName: \"kubernetes.io/projected/7a61a289-8974-49e8-84e5-00f258a378a4-kube-api-access-wtzsj\") pod \"authorino-operator-657f44b778-pxbzh\" (UID: \"7a61a289-8974-49e8-84e5-00f258a378a4\") " pod="kuadrant-system/authorino-operator-657f44b778-pxbzh"
Apr 17 20:53:25.424314 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.424295 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtzsj\" (UniqueName: \"kubernetes.io/projected/7a61a289-8974-49e8-84e5-00f258a378a4-kube-api-access-wtzsj\") pod \"authorino-operator-657f44b778-pxbzh\" (UID: \"7a61a289-8974-49e8-84e5-00f258a378a4\") " pod="kuadrant-system/authorino-operator-657f44b778-pxbzh"
Apr 17 20:53:25.575343 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.575300 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-pxbzh"
Apr 17 20:53:25.695380 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:25.695354 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-pxbzh"]
Apr 17 20:53:25.697340 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:53:25.697311 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a61a289_8974_49e8_84e5_00f258a378a4.slice/crio-f0a8738c5c8d0101d29ae07ad1b3e475ae67d173111e8d8f44371840dbf7a6f1 WatchSource:0}: Error finding container f0a8738c5c8d0101d29ae07ad1b3e475ae67d173111e8d8f44371840dbf7a6f1: Status 404 returned error can't find the container with id f0a8738c5c8d0101d29ae07ad1b3e475ae67d173111e8d8f44371840dbf7a6f1
Apr 17 20:53:26.593969 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:26.593927 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-pxbzh" event={"ID":"7a61a289-8974-49e8-84e5-00f258a378a4","Type":"ContainerStarted","Data":"f0a8738c5c8d0101d29ae07ad1b3e475ae67d173111e8d8f44371840dbf7a6f1"}
Apr 17 20:53:28.603669 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:28.603638 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-pxbzh" event={"ID":"7a61a289-8974-49e8-84e5-00f258a378a4","Type":"ContainerStarted","Data":"c0cb0158b350e31b71ede5c703874c1b283445221508b87ed11c2454b95c7fd0"}
Apr 17 20:53:28.604055 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:28.603818 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-pxbzh"
Apr 17 20:53:28.621830 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:28.621771 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-pxbzh" podStartSLOduration=1.471685115 podStartE2EDuration="3.621759851s" podCreationTimestamp="2026-04-17 20:53:25 +0000 UTC" firstStartedPulling="2026-04-17 20:53:25.699666707 +0000 UTC m=+557.424934043" lastFinishedPulling="2026-04-17 20:53:27.849741438 +0000 UTC m=+559.575008779" observedRunningTime="2026-04-17 20:53:28.618781343 +0000 UTC m=+560.344048694" watchObservedRunningTime="2026-04-17 20:53:28.621759851 +0000 UTC m=+560.347027209"
Apr 17 20:53:30.384436 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.384407 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f797c447c-qgvtl"]
Apr 17 20:53:30.392965 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.392944 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.405610 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.405585 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f797c447c-qgvtl"]
Apr 17 20:53:30.457969 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.457940 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b90ece81-32c7-4785-b359-6704d69fbae3-console-serving-cert\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.458108 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.458039 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b90ece81-32c7-4785-b359-6704d69fbae3-oauth-serving-cert\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.458108 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.458087 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b90ece81-32c7-4785-b359-6704d69fbae3-service-ca\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.458210 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.458118 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b90ece81-32c7-4785-b359-6704d69fbae3-trusted-ca-bundle\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.458210 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.458142 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b90ece81-32c7-4785-b359-6704d69fbae3-console-oauth-config\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.458210 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.458173 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v564z\" (UniqueName: \"kubernetes.io/projected/b90ece81-32c7-4785-b359-6704d69fbae3-kube-api-access-v564z\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.458210 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.458204 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b90ece81-32c7-4785-b359-6704d69fbae3-console-config\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.559150 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.559103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b90ece81-32c7-4785-b359-6704d69fbae3-service-ca\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.559150 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.559158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b90ece81-32c7-4785-b359-6704d69fbae3-trusted-ca-bundle\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.559401 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.559178 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b90ece81-32c7-4785-b359-6704d69fbae3-console-oauth-config\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.559401 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.559195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v564z\" (UniqueName: \"kubernetes.io/projected/b90ece81-32c7-4785-b359-6704d69fbae3-kube-api-access-v564z\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.559401 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.559224 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b90ece81-32c7-4785-b359-6704d69fbae3-console-config\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.559401 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.559261 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b90ece81-32c7-4785-b359-6704d69fbae3-console-serving-cert\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.559401 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.559310 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b90ece81-32c7-4785-b359-6704d69fbae3-oauth-serving-cert\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.559928 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.559902 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b90ece81-32c7-4785-b359-6704d69fbae3-service-ca\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.560036 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.559943 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b90ece81-32c7-4785-b359-6704d69fbae3-console-config\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.560036 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.559984 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b90ece81-32c7-4785-b359-6704d69fbae3-oauth-serving-cert\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.560134 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.560058 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b90ece81-32c7-4785-b359-6704d69fbae3-trusted-ca-bundle\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.561722 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.561694 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b90ece81-32c7-4785-b359-6704d69fbae3-console-serving-cert\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.561798 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.561706 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b90ece81-32c7-4785-b359-6704d69fbae3-console-oauth-config\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.569872 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.569853 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v564z\" (UniqueName: \"kubernetes.io/projected/b90ece81-32c7-4785-b359-6704d69fbae3-kube-api-access-v564z\") pod \"console-7f797c447c-qgvtl\" (UID: \"b90ece81-32c7-4785-b359-6704d69fbae3\") " pod="openshift-console/console-7f797c447c-qgvtl"
Apr 17 20:53:30.702036 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.701961 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-7f797c447c-qgvtl" Apr 17 20:53:30.823003 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:30.822973 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f797c447c-qgvtl"] Apr 17 20:53:30.823980 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:53:30.823950 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb90ece81_32c7_4785_b359_6704d69fbae3.slice/crio-deb6a10a1853ddc04cc717d61f98a0e04b163d0bc320db1cb3b1e344e20ea5b5 WatchSource:0}: Error finding container deb6a10a1853ddc04cc717d61f98a0e04b163d0bc320db1cb3b1e344e20ea5b5: Status 404 returned error can't find the container with id deb6a10a1853ddc04cc717d61f98a0e04b163d0bc320db1cb3b1e344e20ea5b5 Apr 17 20:53:31.616897 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:31.616857 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f797c447c-qgvtl" event={"ID":"b90ece81-32c7-4785-b359-6704d69fbae3","Type":"ContainerStarted","Data":"6939e2165bdf963441b7924857e0e9bff56db6c531b4fb520b4e4d3a198e440b"} Apr 17 20:53:31.616897 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:31.616896 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f797c447c-qgvtl" event={"ID":"b90ece81-32c7-4785-b359-6704d69fbae3","Type":"ContainerStarted","Data":"deb6a10a1853ddc04cc717d61f98a0e04b163d0bc320db1cb3b1e344e20ea5b5"} Apr 17 20:53:31.645327 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:31.645279 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f797c447c-qgvtl" podStartSLOduration=1.6452600309999998 podStartE2EDuration="1.645260031s" podCreationTimestamp="2026-04-17 20:53:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:53:31.64271928 +0000 UTC 
m=+563.367986637" watchObservedRunningTime="2026-04-17 20:53:31.645260031 +0000 UTC m=+563.370527389" Apr 17 20:53:39.610359 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:39.610329 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-pxbzh" Apr 17 20:53:40.702441 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:40.702408 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f797c447c-qgvtl" Apr 17 20:53:40.702817 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:40.702455 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7f797c447c-qgvtl" Apr 17 20:53:40.706742 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:40.706720 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f797c447c-qgvtl" Apr 17 20:53:41.658683 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:41.658657 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f797c447c-qgvtl" Apr 17 20:53:41.708154 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:41.708125 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c477c97d9-dc8qb"] Apr 17 20:53:42.068654 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.068618 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c"] Apr 17 20:53:42.072553 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.072530 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" Apr 17 20:53:42.074462 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.074441 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 17 20:53:42.074577 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.074511 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-d72jt\"" Apr 17 20:53:42.074577 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.074515 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 17 20:53:42.080326 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.080304 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c"] Apr 17 20:53:42.170977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.170942 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/99abd02e-9132-4ff7-82a7-a935c6e01b60-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-82c9c\" (UID: \"99abd02e-9132-4ff7-82a7-a935c6e01b60\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" Apr 17 20:53:42.170977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.170979 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnf2x\" (UniqueName: \"kubernetes.io/projected/99abd02e-9132-4ff7-82a7-a935c6e01b60-kube-api-access-dnf2x\") pod \"kuadrant-console-plugin-6cb54b5c86-82c9c\" (UID: \"99abd02e-9132-4ff7-82a7-a935c6e01b60\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" Apr 17 20:53:42.171189 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.171059 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/99abd02e-9132-4ff7-82a7-a935c6e01b60-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-82c9c\" (UID: \"99abd02e-9132-4ff7-82a7-a935c6e01b60\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" Apr 17 20:53:42.272072 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.272035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/99abd02e-9132-4ff7-82a7-a935c6e01b60-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-82c9c\" (UID: \"99abd02e-9132-4ff7-82a7-a935c6e01b60\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" Apr 17 20:53:42.272072 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.272074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnf2x\" (UniqueName: \"kubernetes.io/projected/99abd02e-9132-4ff7-82a7-a935c6e01b60-kube-api-access-dnf2x\") pod \"kuadrant-console-plugin-6cb54b5c86-82c9c\" (UID: \"99abd02e-9132-4ff7-82a7-a935c6e01b60\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" Apr 17 20:53:42.272294 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.272124 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/99abd02e-9132-4ff7-82a7-a935c6e01b60-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-82c9c\" (UID: \"99abd02e-9132-4ff7-82a7-a935c6e01b60\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" Apr 17 20:53:42.272294 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:53:42.272267 2572 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 17 20:53:42.272368 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:53:42.272341 2572 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/99abd02e-9132-4ff7-82a7-a935c6e01b60-plugin-serving-cert podName:99abd02e-9132-4ff7-82a7-a935c6e01b60 nodeName:}" failed. No retries permitted until 2026-04-17 20:53:42.7723199 +0000 UTC m=+574.497587240 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/99abd02e-9132-4ff7-82a7-a935c6e01b60-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-82c9c" (UID: "99abd02e-9132-4ff7-82a7-a935c6e01b60") : secret "plugin-serving-cert" not found Apr 17 20:53:42.272814 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.272776 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/99abd02e-9132-4ff7-82a7-a935c6e01b60-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-82c9c\" (UID: \"99abd02e-9132-4ff7-82a7-a935c6e01b60\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" Apr 17 20:53:42.284305 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.284284 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnf2x\" (UniqueName: \"kubernetes.io/projected/99abd02e-9132-4ff7-82a7-a935c6e01b60-kube-api-access-dnf2x\") pod \"kuadrant-console-plugin-6cb54b5c86-82c9c\" (UID: \"99abd02e-9132-4ff7-82a7-a935c6e01b60\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" Apr 17 20:53:42.777928 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.777890 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/99abd02e-9132-4ff7-82a7-a935c6e01b60-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-82c9c\" (UID: \"99abd02e-9132-4ff7-82a7-a935c6e01b60\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" Apr 17 20:53:42.785642 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.785616 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/99abd02e-9132-4ff7-82a7-a935c6e01b60-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-82c9c\" (UID: \"99abd02e-9132-4ff7-82a7-a935c6e01b60\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" Apr 17 20:53:42.984237 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:42.984208 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" Apr 17 20:53:43.112549 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:43.112524 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c"] Apr 17 20:53:43.114125 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:53:43.114088 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99abd02e_9132_4ff7_82a7_a935c6e01b60.slice/crio-4ae1730c7758ea8d8be42082e6f76f74819cae42dbce8fdce823db94ec025fe1 WatchSource:0}: Error finding container 4ae1730c7758ea8d8be42082e6f76f74819cae42dbce8fdce823db94ec025fe1: Status 404 returned error can't find the container with id 4ae1730c7758ea8d8be42082e6f76f74819cae42dbce8fdce823db94ec025fe1 Apr 17 20:53:43.663327 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:43.663293 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" event={"ID":"99abd02e-9132-4ff7-82a7-a935c6e01b60","Type":"ContainerStarted","Data":"4ae1730c7758ea8d8be42082e6f76f74819cae42dbce8fdce823db94ec025fe1"} Apr 17 20:53:52.213003 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:52.212939 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4"] Apr 17 20:53:52.220857 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:52.220831 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" Apr 17 20:53:52.240511 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:52.236397 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-92nqz\"" Apr 17 20:53:52.241934 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:52.241901 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4"] Apr 17 20:53:52.374119 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:52.374080 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/658444f0-18b8-4578-a12e-3f1acae23320-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-spdc4\" (UID: \"658444f0-18b8-4578-a12e-3f1acae23320\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" Apr 17 20:53:52.374295 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:52.374190 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpdb2\" (UniqueName: \"kubernetes.io/projected/658444f0-18b8-4578-a12e-3f1acae23320-kube-api-access-fpdb2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-spdc4\" (UID: \"658444f0-18b8-4578-a12e-3f1acae23320\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" Apr 17 20:53:52.475229 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:52.475151 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpdb2\" (UniqueName: \"kubernetes.io/projected/658444f0-18b8-4578-a12e-3f1acae23320-kube-api-access-fpdb2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-spdc4\" (UID: \"658444f0-18b8-4578-a12e-3f1acae23320\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" Apr 17 20:53:52.475229 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:52.475224 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/658444f0-18b8-4578-a12e-3f1acae23320-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-spdc4\" (UID: \"658444f0-18b8-4578-a12e-3f1acae23320\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" Apr 17 20:53:52.475613 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:52.475591 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/658444f0-18b8-4578-a12e-3f1acae23320-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-spdc4\" (UID: \"658444f0-18b8-4578-a12e-3f1acae23320\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" Apr 17 20:53:52.486581 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:52.486543 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpdb2\" (UniqueName: \"kubernetes.io/projected/658444f0-18b8-4578-a12e-3f1acae23320-kube-api-access-fpdb2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-spdc4\" (UID: \"658444f0-18b8-4578-a12e-3f1acae23320\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" Apr 17 20:53:52.551992 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:53:52.551960 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" Apr 17 20:54:04.586924 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:04.586885 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4"] Apr 17 20:54:05.052722 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:05.052696 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4"] Apr 17 20:54:05.054260 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:54:05.054233 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod658444f0_18b8_4578_a12e_3f1acae23320.slice/crio-6dd2d393f1548fa0d4eebe75e95e4bf64372fc84979feebe386efbb5393ae3ca WatchSource:0}: Error finding container 6dd2d393f1548fa0d4eebe75e95e4bf64372fc84979feebe386efbb5393ae3ca: Status 404 returned error can't find the container with id 6dd2d393f1548fa0d4eebe75e95e4bf64372fc84979feebe386efbb5393ae3ca Apr 17 20:54:05.759723 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:05.759686 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" event={"ID":"99abd02e-9132-4ff7-82a7-a935c6e01b60","Type":"ContainerStarted","Data":"e647d07fb7a4dffa203e8c2c28910f28addbfbabe6bb0147488a5d4c8bd97365"} Apr 17 20:54:05.761259 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:05.761203 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" event={"ID":"658444f0-18b8-4578-a12e-3f1acae23320","Type":"ContainerStarted","Data":"6dd2d393f1548fa0d4eebe75e95e4bf64372fc84979feebe386efbb5393ae3ca"} Apr 17 20:54:05.775445 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:05.775393 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-82c9c" podStartSLOduration=1.881754453 podStartE2EDuration="23.775376435s" podCreationTimestamp="2026-04-17 20:53:42 +0000 UTC" firstStartedPulling="2026-04-17 20:53:43.115226346 +0000 UTC m=+574.840493693" lastFinishedPulling="2026-04-17 20:54:05.008848337 +0000 UTC m=+596.734115675" observedRunningTime="2026-04-17 20:54:05.773303984 +0000 UTC m=+597.498571341" watchObservedRunningTime="2026-04-17 20:54:05.775376435 +0000 UTC m=+597.500643792" Apr 17 20:54:06.732853 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:06.732788 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7c477c97d9-dc8qb" podUID="cd91522a-fa69-4614-8392-367a32d29a4d" containerName="console" containerID="cri-o://455663c7f69497e4b4fd0c61da90ac4e15d58f185a1b1ac209c025a55573b364" gracePeriod=15 Apr 17 20:54:06.996556 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:06.996492 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c477c97d9-dc8qb_cd91522a-fa69-4614-8392-367a32d29a4d/console/0.log" Apr 17 20:54:06.996962 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:06.996636 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:54:07.115749 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.115720 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r828h\" (UniqueName: \"kubernetes.io/projected/cd91522a-fa69-4614-8392-367a32d29a4d-kube-api-access-r828h\") pod \"cd91522a-fa69-4614-8392-367a32d29a4d\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " Apr 17 20:54:07.115749 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.115752 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd91522a-fa69-4614-8392-367a32d29a4d-console-oauth-config\") pod \"cd91522a-fa69-4614-8392-367a32d29a4d\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " Apr 17 20:54:07.116001 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.115775 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-service-ca\") pod \"cd91522a-fa69-4614-8392-367a32d29a4d\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " Apr 17 20:54:07.116001 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.115843 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-trusted-ca-bundle\") pod \"cd91522a-fa69-4614-8392-367a32d29a4d\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " Apr 17 20:54:07.116001 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.115874 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd91522a-fa69-4614-8392-367a32d29a4d-console-serving-cert\") pod \"cd91522a-fa69-4614-8392-367a32d29a4d\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " Apr 17 20:54:07.116001 
ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.115902 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-oauth-serving-cert\") pod \"cd91522a-fa69-4614-8392-367a32d29a4d\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " Apr 17 20:54:07.116001 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.115974 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-console-config\") pod \"cd91522a-fa69-4614-8392-367a32d29a4d\" (UID: \"cd91522a-fa69-4614-8392-367a32d29a4d\") " Apr 17 20:54:07.116309 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.116275 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-service-ca" (OuterVolumeSpecName: "service-ca") pod "cd91522a-fa69-4614-8392-367a32d29a4d" (UID: "cd91522a-fa69-4614-8392-367a32d29a4d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:54:07.116548 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.116499 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-console-config" (OuterVolumeSpecName: "console-config") pod "cd91522a-fa69-4614-8392-367a32d29a4d" (UID: "cd91522a-fa69-4614-8392-367a32d29a4d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:54:07.116548 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.116510 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cd91522a-fa69-4614-8392-367a32d29a4d" (UID: "cd91522a-fa69-4614-8392-367a32d29a4d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:54:07.116674 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.116546 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cd91522a-fa69-4614-8392-367a32d29a4d" (UID: "cd91522a-fa69-4614-8392-367a32d29a4d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:54:07.118029 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.118006 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd91522a-fa69-4614-8392-367a32d29a4d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cd91522a-fa69-4614-8392-367a32d29a4d" (UID: "cd91522a-fa69-4614-8392-367a32d29a4d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:54:07.118366 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.118339 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd91522a-fa69-4614-8392-367a32d29a4d-kube-api-access-r828h" (OuterVolumeSpecName: "kube-api-access-r828h") pod "cd91522a-fa69-4614-8392-367a32d29a4d" (UID: "cd91522a-fa69-4614-8392-367a32d29a4d"). InnerVolumeSpecName "kube-api-access-r828h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:54:07.118487 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.118466 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd91522a-fa69-4614-8392-367a32d29a4d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cd91522a-fa69-4614-8392-367a32d29a4d" (UID: "cd91522a-fa69-4614-8392-367a32d29a4d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:54:07.217315 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.217276 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r828h\" (UniqueName: \"kubernetes.io/projected/cd91522a-fa69-4614-8392-367a32d29a4d-kube-api-access-r828h\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:54:07.217315 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.217314 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd91522a-fa69-4614-8392-367a32d29a4d-console-oauth-config\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:54:07.217500 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.217324 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-service-ca\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:54:07.217500 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.217333 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-trusted-ca-bundle\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:54:07.217500 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.217343 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cd91522a-fa69-4614-8392-367a32d29a4d-console-serving-cert\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:54:07.217500 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.217374 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-oauth-serving-cert\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:54:07.217500 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.217384 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd91522a-fa69-4614-8392-367a32d29a4d-console-config\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:54:07.773112 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.773084 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c477c97d9-dc8qb_cd91522a-fa69-4614-8392-367a32d29a4d/console/0.log" Apr 17 20:54:07.773279 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.773125 2572 generic.go:358] "Generic (PLEG): container finished" podID="cd91522a-fa69-4614-8392-367a32d29a4d" containerID="455663c7f69497e4b4fd0c61da90ac4e15d58f185a1b1ac209c025a55573b364" exitCode=2 Apr 17 20:54:07.773279 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.773163 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c477c97d9-dc8qb" event={"ID":"cd91522a-fa69-4614-8392-367a32d29a4d","Type":"ContainerDied","Data":"455663c7f69497e4b4fd0c61da90ac4e15d58f185a1b1ac209c025a55573b364"} Apr 17 20:54:07.773279 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.773186 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c477c97d9-dc8qb" event={"ID":"cd91522a-fa69-4614-8392-367a32d29a4d","Type":"ContainerDied","Data":"518189afaf1ddd8ec941e46a60e7f2b307bb38d65b51193c91f7597f5cac82fd"} Apr 17 20:54:07.773279 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:54:07.773188 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c477c97d9-dc8qb" Apr 17 20:54:07.773279 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.773203 2572 scope.go:117] "RemoveContainer" containerID="455663c7f69497e4b4fd0c61da90ac4e15d58f185a1b1ac209c025a55573b364" Apr 17 20:54:07.788511 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.788493 2572 scope.go:117] "RemoveContainer" containerID="455663c7f69497e4b4fd0c61da90ac4e15d58f185a1b1ac209c025a55573b364" Apr 17 20:54:07.788730 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:54:07.788714 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"455663c7f69497e4b4fd0c61da90ac4e15d58f185a1b1ac209c025a55573b364\": container with ID starting with 455663c7f69497e4b4fd0c61da90ac4e15d58f185a1b1ac209c025a55573b364 not found: ID does not exist" containerID="455663c7f69497e4b4fd0c61da90ac4e15d58f185a1b1ac209c025a55573b364" Apr 17 20:54:07.788786 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.788737 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"455663c7f69497e4b4fd0c61da90ac4e15d58f185a1b1ac209c025a55573b364"} err="failed to get container status \"455663c7f69497e4b4fd0c61da90ac4e15d58f185a1b1ac209c025a55573b364\": rpc error: code = NotFound desc = could not find container \"455663c7f69497e4b4fd0c61da90ac4e15d58f185a1b1ac209c025a55573b364\": container with ID starting with 455663c7f69497e4b4fd0c61da90ac4e15d58f185a1b1ac209c025a55573b364 not found: ID does not exist" Apr 17 20:54:07.797240 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.797214 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c477c97d9-dc8qb"] Apr 17 20:54:07.800460 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:07.800440 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-7c477c97d9-dc8qb"] Apr 17 20:54:08.715292 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:08.715260 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd91522a-fa69-4614-8392-367a32d29a4d" path="/var/lib/kubelet/pods/cd91522a-fa69-4614-8392-367a32d29a4d/volumes" Apr 17 20:54:09.159318 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:09.159297 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/0.log" Apr 17 20:54:09.159464 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:09.159383 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/0.log" Apr 17 20:54:09.785401 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:09.785363 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" event={"ID":"658444f0-18b8-4578-a12e-3f1acae23320","Type":"ContainerStarted","Data":"a3eb60b7692fde7148ea00faa727a7e56d6b2783e88223cf119c1c5216f2f37c"} Apr 17 20:54:09.785401 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:09.785396 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" podUID="658444f0-18b8-4578-a12e-3f1acae23320" containerName="manager" containerID="cri-o://a3eb60b7692fde7148ea00faa727a7e56d6b2783e88223cf119c1c5216f2f37c" gracePeriod=10 Apr 17 20:54:09.785401 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:09.785411 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" Apr 17 20:54:09.809662 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:09.809605 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" podStartSLOduration=13.678194232 podStartE2EDuration="17.809590439s" podCreationTimestamp="2026-04-17 20:53:52 +0000 UTC" firstStartedPulling="2026-04-17 20:54:05.056346621 +0000 UTC m=+596.781613958" lastFinishedPulling="2026-04-17 20:54:09.187742829 +0000 UTC m=+600.913010165" observedRunningTime="2026-04-17 20:54:09.800612947 +0000 UTC m=+601.525880304" watchObservedRunningTime="2026-04-17 20:54:09.809590439 +0000 UTC m=+601.534857796" Apr 17 20:54:10.532866 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.532844 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" Apr 17 20:54:10.645708 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.645670 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpdb2\" (UniqueName: \"kubernetes.io/projected/658444f0-18b8-4578-a12e-3f1acae23320-kube-api-access-fpdb2\") pod \"658444f0-18b8-4578-a12e-3f1acae23320\" (UID: \"658444f0-18b8-4578-a12e-3f1acae23320\") " Apr 17 20:54:10.645906 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.645766 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/658444f0-18b8-4578-a12e-3f1acae23320-extensions-socket-volume\") pod \"658444f0-18b8-4578-a12e-3f1acae23320\" (UID: \"658444f0-18b8-4578-a12e-3f1acae23320\") " Apr 17 20:54:10.646226 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.646201 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/658444f0-18b8-4578-a12e-3f1acae23320-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "658444f0-18b8-4578-a12e-3f1acae23320" (UID: "658444f0-18b8-4578-a12e-3f1acae23320"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:54:10.647739 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.647687 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/658444f0-18b8-4578-a12e-3f1acae23320-kube-api-access-fpdb2" (OuterVolumeSpecName: "kube-api-access-fpdb2") pod "658444f0-18b8-4578-a12e-3f1acae23320" (UID: "658444f0-18b8-4578-a12e-3f1acae23320"). InnerVolumeSpecName "kube-api-access-fpdb2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:54:10.746977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.746948 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/658444f0-18b8-4578-a12e-3f1acae23320-extensions-socket-volume\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:54:10.746977 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.746975 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fpdb2\" (UniqueName: \"kubernetes.io/projected/658444f0-18b8-4578-a12e-3f1acae23320-kube-api-access-fpdb2\") on node \"ip-10-0-137-102.ec2.internal\" DevicePath \"\"" Apr 17 20:54:10.789996 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.789964 2572 generic.go:358] "Generic (PLEG): container finished" podID="658444f0-18b8-4578-a12e-3f1acae23320" containerID="a3eb60b7692fde7148ea00faa727a7e56d6b2783e88223cf119c1c5216f2f37c" exitCode=0 Apr 17 20:54:10.790348 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.790028 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" Apr 17 20:54:10.790348 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.790031 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" event={"ID":"658444f0-18b8-4578-a12e-3f1acae23320","Type":"ContainerDied","Data":"a3eb60b7692fde7148ea00faa727a7e56d6b2783e88223cf119c1c5216f2f37c"} Apr 17 20:54:10.790348 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.790131 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4" event={"ID":"658444f0-18b8-4578-a12e-3f1acae23320","Type":"ContainerDied","Data":"6dd2d393f1548fa0d4eebe75e95e4bf64372fc84979feebe386efbb5393ae3ca"} Apr 17 20:54:10.790348 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.790145 2572 scope.go:117] "RemoveContainer" containerID="a3eb60b7692fde7148ea00faa727a7e56d6b2783e88223cf119c1c5216f2f37c" Apr 17 20:54:10.798653 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.798636 2572 scope.go:117] "RemoveContainer" containerID="a3eb60b7692fde7148ea00faa727a7e56d6b2783e88223cf119c1c5216f2f37c" Apr 17 20:54:10.798919 ip-10-0-137-102 kubenswrapper[2572]: E0417 20:54:10.798899 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3eb60b7692fde7148ea00faa727a7e56d6b2783e88223cf119c1c5216f2f37c\": container with ID starting with a3eb60b7692fde7148ea00faa727a7e56d6b2783e88223cf119c1c5216f2f37c not found: ID does not exist" containerID="a3eb60b7692fde7148ea00faa727a7e56d6b2783e88223cf119c1c5216f2f37c" Apr 17 20:54:10.798984 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.798927 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3eb60b7692fde7148ea00faa727a7e56d6b2783e88223cf119c1c5216f2f37c"} err="failed to get container status 
\"a3eb60b7692fde7148ea00faa727a7e56d6b2783e88223cf119c1c5216f2f37c\": rpc error: code = NotFound desc = could not find container \"a3eb60b7692fde7148ea00faa727a7e56d6b2783e88223cf119c1c5216f2f37c\": container with ID starting with a3eb60b7692fde7148ea00faa727a7e56d6b2783e88223cf119c1c5216f2f37c not found: ID does not exist" Apr 17 20:54:10.805069 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.805049 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4"] Apr 17 20:54:10.809090 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:10.809070 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-spdc4"] Apr 17 20:54:12.713601 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:12.713566 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="658444f0-18b8-4578-a12e-3f1acae23320" path="/var/lib/kubelet/pods/658444f0-18b8-4578-a12e-3f1acae23320/volumes" Apr 17 20:54:20.806307 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.806213 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb"] Apr 17 20:54:20.806856 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.806832 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="658444f0-18b8-4578-a12e-3f1acae23320" containerName="manager" Apr 17 20:54:20.807027 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.807014 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="658444f0-18b8-4578-a12e-3f1acae23320" containerName="manager" Apr 17 20:54:20.807181 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.807162 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd91522a-fa69-4614-8392-367a32d29a4d" containerName="console" Apr 17 20:54:20.807181 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.807181 2572 
state_mem.go:107] "Deleted CPUSet assignment" podUID="cd91522a-fa69-4614-8392-367a32d29a4d" containerName="console" Apr 17 20:54:20.807324 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.807290 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="658444f0-18b8-4578-a12e-3f1acae23320" containerName="manager" Apr 17 20:54:20.807324 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.807311 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd91522a-fa69-4614-8392-367a32d29a4d" containerName="console" Apr 17 20:54:20.818235 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.818207 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:20.820850 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.820288 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-phtth\"" Apr 17 20:54:20.823227 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.823196 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb"] Apr 17 20:54:20.927568 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.927538 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:20.927731 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.927573 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:20.927731 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.927603 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:20.927731 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.927623 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72h4p\" (UniqueName: \"kubernetes.io/projected/84b13877-31ca-4a62-93d7-502c83f5e077-kube-api-access-72h4p\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:20.927883 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.927736 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/84b13877-31ca-4a62-93d7-502c83f5e077-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:20.927883 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.927813 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:20.927883 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.927851 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/84b13877-31ca-4a62-93d7-502c83f5e077-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:20.928007 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.927903 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:20.928007 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:20.927944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/84b13877-31ca-4a62-93d7-502c83f5e077-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.029403 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.029360 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.029403 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.029416 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/84b13877-31ca-4a62-93d7-502c83f5e077-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.029676 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.029444 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.029676 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.029476 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.029676 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.029528 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: 
\"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.029676 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.029557 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72h4p\" (UniqueName: \"kubernetes.io/projected/84b13877-31ca-4a62-93d7-502c83f5e077-kube-api-access-72h4p\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.029676 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.029614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/84b13877-31ca-4a62-93d7-502c83f5e077-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.029676 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.029662 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.030010 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.029696 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/84b13877-31ca-4a62-93d7-502c83f5e077-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.030405 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.030380 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.030550 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.030477 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/84b13877-31ca-4a62-93d7-502c83f5e077-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.030891 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.030870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.031240 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.031222 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.031789 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.031768 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.035854 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.035286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/84b13877-31ca-4a62-93d7-502c83f5e077-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.038739 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.037449 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/84b13877-31ca-4a62-93d7-502c83f5e077-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.045476 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.045451 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/84b13877-31ca-4a62-93d7-502c83f5e077-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.047122 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.047103 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72h4p\" (UniqueName: 
\"kubernetes.io/projected/84b13877-31ca-4a62-93d7-502c83f5e077-kube-api-access-72h4p\") pod \"maas-default-gateway-openshift-default-58b6f876-8fdfb\" (UID: \"84b13877-31ca-4a62-93d7-502c83f5e077\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.132113 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.132036 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:21.258520 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.258497 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb"] Apr 17 20:54:21.260140 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:54:21.260110 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84b13877_31ca_4a62_93d7_502c83f5e077.slice/crio-d8f4ba05706b57d69cbcd87dd44ebbf7ff8d8f78dda0dbb71829f0ec54f02dd7 WatchSource:0}: Error finding container d8f4ba05706b57d69cbcd87dd44ebbf7ff8d8f78dda0dbb71829f0ec54f02dd7: Status 404 returned error can't find the container with id d8f4ba05706b57d69cbcd87dd44ebbf7ff8d8f78dda0dbb71829f0ec54f02dd7 Apr 17 20:54:21.262210 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.262178 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 20:54:21.262290 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.262243 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 20:54:21.262290 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.262276 2572 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 20:54:21.837114 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.837077 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" event={"ID":"84b13877-31ca-4a62-93d7-502c83f5e077","Type":"ContainerStarted","Data":"9f1930375f1080071c950e72e185627bcaa95ceb9cf8e0334a56d0e5c49fc3ad"} Apr 17 20:54:21.837114 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.837114 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" event={"ID":"84b13877-31ca-4a62-93d7-502c83f5e077","Type":"ContainerStarted","Data":"d8f4ba05706b57d69cbcd87dd44ebbf7ff8d8f78dda0dbb71829f0ec54f02dd7"} Apr 17 20:54:21.856069 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:21.856027 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" podStartSLOduration=1.85600875 podStartE2EDuration="1.85600875s" podCreationTimestamp="2026-04-17 20:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:54:21.853259796 +0000 UTC m=+613.578527176" watchObservedRunningTime="2026-04-17 20:54:21.85600875 +0000 UTC m=+613.581276106" Apr 17 20:54:22.133012 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:22.132933 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:22.137860 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:22.137835 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:22.841606 ip-10-0-137-102 
kubenswrapper[2572]: I0417 20:54:22.841573 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:22.842592 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:22.842578 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8fdfb" Apr 17 20:54:35.828638 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:35.828605 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pbdxx"] Apr 17 20:54:35.898955 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:35.898924 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pbdxx"] Apr 17 20:54:35.898955 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:35.898958 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pbdxx"] Apr 17 20:54:35.899161 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:35.899010 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-pbdxx" Apr 17 20:54:35.901053 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:35.901027 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 20:54:36.067718 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:36.067681 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/404b0624-a29f-467a-8cb9-1b322eba1036-config-file\") pod \"limitador-limitador-78c99df468-pbdxx\" (UID: \"404b0624-a29f-467a-8cb9-1b322eba1036\") " pod="kuadrant-system/limitador-limitador-78c99df468-pbdxx" Apr 17 20:54:36.067896 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:36.067756 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvn6n\" (UniqueName: \"kubernetes.io/projected/404b0624-a29f-467a-8cb9-1b322eba1036-kube-api-access-fvn6n\") pod \"limitador-limitador-78c99df468-pbdxx\" (UID: \"404b0624-a29f-467a-8cb9-1b322eba1036\") " pod="kuadrant-system/limitador-limitador-78c99df468-pbdxx" Apr 17 20:54:36.169188 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:36.169114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvn6n\" (UniqueName: \"kubernetes.io/projected/404b0624-a29f-467a-8cb9-1b322eba1036-kube-api-access-fvn6n\") pod \"limitador-limitador-78c99df468-pbdxx\" (UID: \"404b0624-a29f-467a-8cb9-1b322eba1036\") " pod="kuadrant-system/limitador-limitador-78c99df468-pbdxx" Apr 17 20:54:36.169357 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:36.169203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/404b0624-a29f-467a-8cb9-1b322eba1036-config-file\") pod \"limitador-limitador-78c99df468-pbdxx\" (UID: 
\"404b0624-a29f-467a-8cb9-1b322eba1036\") " pod="kuadrant-system/limitador-limitador-78c99df468-pbdxx" Apr 17 20:54:36.169752 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:36.169732 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/404b0624-a29f-467a-8cb9-1b322eba1036-config-file\") pod \"limitador-limitador-78c99df468-pbdxx\" (UID: \"404b0624-a29f-467a-8cb9-1b322eba1036\") " pod="kuadrant-system/limitador-limitador-78c99df468-pbdxx" Apr 17 20:54:36.175918 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:36.175900 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvn6n\" (UniqueName: \"kubernetes.io/projected/404b0624-a29f-467a-8cb9-1b322eba1036-kube-api-access-fvn6n\") pod \"limitador-limitador-78c99df468-pbdxx\" (UID: \"404b0624-a29f-467a-8cb9-1b322eba1036\") " pod="kuadrant-system/limitador-limitador-78c99df468-pbdxx" Apr 17 20:54:36.210005 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:36.209980 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-pbdxx" Apr 17 20:54:36.340562 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:36.340537 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pbdxx"] Apr 17 20:54:36.342524 ip-10-0-137-102 kubenswrapper[2572]: W0417 20:54:36.342493 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod404b0624_a29f_467a_8cb9_1b322eba1036.slice/crio-6e1d8437b40bc1f655c3da4a902d7359a23efd7b495045f87861fc7b06b2279e WatchSource:0}: Error finding container 6e1d8437b40bc1f655c3da4a902d7359a23efd7b495045f87861fc7b06b2279e: Status 404 returned error can't find the container with id 6e1d8437b40bc1f655c3da4a902d7359a23efd7b495045f87861fc7b06b2279e Apr 17 20:54:36.897817 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:36.897632 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-pbdxx" event={"ID":"404b0624-a29f-467a-8cb9-1b322eba1036","Type":"ContainerStarted","Data":"6e1d8437b40bc1f655c3da4a902d7359a23efd7b495045f87861fc7b06b2279e"} Apr 17 20:54:39.911698 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:39.911655 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-pbdxx" event={"ID":"404b0624-a29f-467a-8cb9-1b322eba1036","Type":"ContainerStarted","Data":"d29738e08275c7476a84487789d705166910325e89daccaf69268da95e0b31e8"} Apr 17 20:54:39.912158 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:39.911734 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-pbdxx" Apr 17 20:54:39.928541 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:39.928489 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-pbdxx" podStartSLOduration=2.373100052 
podStartE2EDuration="4.92847621s" podCreationTimestamp="2026-04-17 20:54:35 +0000 UTC" firstStartedPulling="2026-04-17 20:54:36.344365608 +0000 UTC m=+628.069632946" lastFinishedPulling="2026-04-17 20:54:38.899741769 +0000 UTC m=+630.625009104" observedRunningTime="2026-04-17 20:54:39.925091816 +0000 UTC m=+631.650359173" watchObservedRunningTime="2026-04-17 20:54:39.92847621 +0000 UTC m=+631.653743583" Apr 17 20:54:50.917436 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:54:50.917405 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-pbdxx" Apr 17 20:59:09.196855 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:59:09.196799 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/0.log" Apr 17 20:59:09.197460 ip-10-0-137-102 kubenswrapper[2572]: I0417 20:59:09.197440 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/0.log" Apr 17 21:04:09.231343 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:04:09.231316 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/0.log" Apr 17 21:04:09.233725 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:04:09.233489 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/0.log" Apr 17 21:05:51.704841 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:51.704787 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6dc4849f89-8z4fv_9167dae3-8f4c-426c-b9f7-095acc43072f/manager/0.log" Apr 17 21:05:52.791158 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:52.791131 2572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w_67a45dd8-2e11-422d-bc6c-a98b6de68e1f/util/0.log" Apr 17 21:05:52.797509 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:52.797490 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w_67a45dd8-2e11-422d-bc6c-a98b6de68e1f/pull/0.log" Apr 17 21:05:52.803465 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:52.803445 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w_67a45dd8-2e11-422d-bc6c-a98b6de68e1f/extract/0.log" Apr 17 21:05:52.905534 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:52.905510 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2_5600ae44-a4ac-4606-973d-5f0aa6ba421a/util/0.log" Apr 17 21:05:52.912085 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:52.912065 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2_5600ae44-a4ac-4606-973d-5f0aa6ba421a/pull/0.log" Apr 17 21:05:52.918229 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:52.918199 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2_5600ae44-a4ac-4606-973d-5f0aa6ba421a/extract/0.log" Apr 17 21:05:53.028669 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:53.028640 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf_0cf107ca-d674-4bbf-a284-1082b47310b0/util/0.log" Apr 17 21:05:53.035039 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:53.035014 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf_0cf107ca-d674-4bbf-a284-1082b47310b0/pull/0.log" Apr 17 21:05:53.041343 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:53.041294 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf_0cf107ca-d674-4bbf-a284-1082b47310b0/extract/0.log" Apr 17 21:05:53.154306 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:53.154284 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx_c17421b5-364f-4970-a4ae-a9a50d16e23d/util/0.log" Apr 17 21:05:53.161301 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:53.161283 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx_c17421b5-364f-4970-a4ae-a9a50d16e23d/pull/0.log" Apr 17 21:05:53.167646 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:53.167627 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx_c17421b5-364f-4970-a4ae-a9a50d16e23d/extract/0.log" Apr 17 21:05:53.391885 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:53.391789 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-pxbzh_7a61a289-8974-49e8-84e5-00f258a378a4/manager/0.log" Apr 17 21:05:53.616346 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:53.616321 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-82c9c_99abd02e-9132-4ff7-82a7-a935c6e01b60/kuadrant-console-plugin/0.log" Apr 17 21:05:53.730717 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:53.730645 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-8wrsk_c5b1e4bb-538f-4cfa-9d66-332b5e79efca/registry-server/0.log" Apr 17 21:05:53.954541 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:53.954515 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-pbdxx_404b0624-a29f-467a-8cb9-1b322eba1036/limitador/0.log" Apr 17 21:05:54.396217 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:54.396188 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4_53fbeaf4-e0cf-4a00-8d61-654887368d4a/istio-proxy/0.log" Apr 17 21:05:54.833467 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:54.833443 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-8fdfb_84b13877-31ca-4a62-93d7-502c83f5e077/istio-proxy/0.log" Apr 17 21:05:54.943991 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:05:54.943962 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-798db9665d-pb48m_b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b/router/0.log" Apr 17 21:06:01.374824 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:01.374778 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gk7jn_57c365ae-f2db-4533-9e62-b1193ccbe5c8/global-pull-secret-syncer/0.log" Apr 17 21:06:01.417177 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:01.417152 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-bqh55_f090e67f-4a19-4f45-a57b-d70ee9f84598/konnectivity-agent/0.log" Apr 17 21:06:01.485088 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:01.485060 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-102.ec2.internal_7a467953d27af2c37f58628655d1416c/haproxy/0.log" Apr 17 21:06:05.078365 ip-10-0-137-102 
kubenswrapper[2572]: I0417 21:06:05.078334 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w_67a45dd8-2e11-422d-bc6c-a98b6de68e1f/extract/0.log" Apr 17 21:06:05.134575 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.134548 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w_67a45dd8-2e11-422d-bc6c-a98b6de68e1f/util/0.log" Apr 17 21:06:05.155475 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.155448 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75929n2w_67a45dd8-2e11-422d-bc6c-a98b6de68e1f/pull/0.log" Apr 17 21:06:05.201414 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.201386 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2_5600ae44-a4ac-4606-973d-5f0aa6ba421a/extract/0.log" Apr 17 21:06:05.229885 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.229858 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2_5600ae44-a4ac-4606-973d-5f0aa6ba421a/util/0.log" Apr 17 21:06:05.255008 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.254982 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02tzs2_5600ae44-a4ac-4606-973d-5f0aa6ba421a/pull/0.log" Apr 17 21:06:05.283552 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.283525 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf_0cf107ca-d674-4bbf-a284-1082b47310b0/extract/0.log" Apr 17 21:06:05.305994 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.305969 2572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf_0cf107ca-d674-4bbf-a284-1082b47310b0/util/0.log" Apr 17 21:06:05.327426 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.327400 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q6lmf_0cf107ca-d674-4bbf-a284-1082b47310b0/pull/0.log" Apr 17 21:06:05.354269 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.354194 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx_c17421b5-364f-4970-a4ae-a9a50d16e23d/extract/0.log" Apr 17 21:06:05.375490 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.375467 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx_c17421b5-364f-4970-a4ae-a9a50d16e23d/util/0.log" Apr 17 21:06:05.403193 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.403168 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1g7ghx_c17421b5-364f-4970-a4ae-a9a50d16e23d/pull/0.log" Apr 17 21:06:05.451385 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.451361 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-pxbzh_7a61a289-8974-49e8-84e5-00f258a378a4/manager/0.log" Apr 17 21:06:05.498397 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.498373 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-82c9c_99abd02e-9132-4ff7-82a7-a935c6e01b60/kuadrant-console-plugin/0.log" Apr 17 21:06:05.559979 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.559953 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-8wrsk_c5b1e4bb-538f-4cfa-9d66-332b5e79efca/registry-server/0.log" Apr 17 21:06:05.630755 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:05.630681 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-pbdxx_404b0624-a29f-467a-8cb9-1b322eba1036/limitador/0.log" Apr 17 21:06:06.901694 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:06.901668 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_352184b8-c491-4213-8df2-6b0459566690/alertmanager/0.log" Apr 17 21:06:06.923430 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:06.923406 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_352184b8-c491-4213-8df2-6b0459566690/config-reloader/0.log" Apr 17 21:06:06.944905 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:06.944885 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_352184b8-c491-4213-8df2-6b0459566690/kube-rbac-proxy-web/0.log" Apr 17 21:06:06.967510 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:06.967483 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_352184b8-c491-4213-8df2-6b0459566690/kube-rbac-proxy/0.log" Apr 17 21:06:06.988450 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:06.988427 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_352184b8-c491-4213-8df2-6b0459566690/kube-rbac-proxy-metric/0.log" Apr 17 21:06:07.011341 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.011317 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_352184b8-c491-4213-8df2-6b0459566690/prom-label-proxy/0.log" Apr 17 21:06:07.046366 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.046338 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_352184b8-c491-4213-8df2-6b0459566690/init-config-reloader/0.log" Apr 17 21:06:07.127625 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.127595 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cdmls_eac69935-3fd7-4a29-9949-4a6e86df9ae0/kube-state-metrics/0.log" Apr 17 21:06:07.152188 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.152130 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cdmls_eac69935-3fd7-4a29-9949-4a6e86df9ae0/kube-rbac-proxy-main/0.log" Apr 17 21:06:07.174899 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.174874 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cdmls_eac69935-3fd7-4a29-9949-4a6e86df9ae0/kube-rbac-proxy-self/0.log" Apr 17 21:06:07.225916 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.225884 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-j7x6j_ad299635-2784-4210-a8f7-45838c9eab4a/monitoring-plugin/0.log" Apr 17 21:06:07.256961 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.256933 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hbxqp_6bbfd575-a18e-4b5a-b924-f78e8e962f05/node-exporter/0.log" Apr 17 21:06:07.282121 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.282092 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hbxqp_6bbfd575-a18e-4b5a-b924-f78e8e962f05/kube-rbac-proxy/0.log" Apr 17 21:06:07.304326 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.304301 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hbxqp_6bbfd575-a18e-4b5a-b924-f78e8e962f05/init-textfile/0.log" Apr 17 21:06:07.493436 ip-10-0-137-102 kubenswrapper[2572]: I0417 
21:06:07.493405 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cdg4h_d4390455-ccf6-4110-ab32-8ce17b8d9693/kube-rbac-proxy-main/0.log" Apr 17 21:06:07.518279 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.518254 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cdg4h_d4390455-ccf6-4110-ab32-8ce17b8d9693/kube-rbac-proxy-self/0.log" Apr 17 21:06:07.541608 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.541582 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cdg4h_d4390455-ccf6-4110-ab32-8ce17b8d9693/openshift-state-metrics/0.log" Apr 17 21:06:07.731534 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.731454 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-mlb85_39b7343d-bfa3-457e-8d3b-64225e9b2a48/prometheus-operator/0.log" Apr 17 21:06:07.759521 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.759480 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-mlb85_39b7343d-bfa3-457e-8d3b-64225e9b2a48/kube-rbac-proxy/0.log" Apr 17 21:06:07.788476 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.788450 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-pfxbt_50669672-c6e8-45ba-a563-ff0834c4d0bf/prometheus-operator-admission-webhook/0.log" Apr 17 21:06:07.821904 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.821873 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f4f46dddc-srlmd_2031370c-07c1-4314-93b9-89184c8ab731/telemeter-client/0.log" Apr 17 21:06:07.843594 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.843569 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-f4f46dddc-srlmd_2031370c-07c1-4314-93b9-89184c8ab731/reload/0.log" Apr 17 21:06:07.864870 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.864848 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f4f46dddc-srlmd_2031370c-07c1-4314-93b9-89184c8ab731/kube-rbac-proxy/0.log" Apr 17 21:06:07.896678 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.896648 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85dd6fc5-s6bh8_75c7ed08-d830-49b4-bf0c-9283732d0744/thanos-query/0.log" Apr 17 21:06:07.918727 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.918694 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85dd6fc5-s6bh8_75c7ed08-d830-49b4-bf0c-9283732d0744/kube-rbac-proxy-web/0.log" Apr 17 21:06:07.939797 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.939772 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85dd6fc5-s6bh8_75c7ed08-d830-49b4-bf0c-9283732d0744/kube-rbac-proxy/0.log" Apr 17 21:06:07.960166 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.960143 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85dd6fc5-s6bh8_75c7ed08-d830-49b4-bf0c-9283732d0744/prom-label-proxy/0.log" Apr 17 21:06:07.984463 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:07.984399 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85dd6fc5-s6bh8_75c7ed08-d830-49b4-bf0c-9283732d0744/kube-rbac-proxy-rules/0.log" Apr 17 21:06:08.021706 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:08.021682 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85dd6fc5-s6bh8_75c7ed08-d830-49b4-bf0c-9283732d0744/kube-rbac-proxy-metrics/0.log" Apr 17 21:06:09.897821 ip-10-0-137-102 
kubenswrapper[2572]: I0417 21:06:09.897777 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx"] Apr 17 21:06:09.901787 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:09.901770 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:09.903751 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:09.903725 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-znlkx\"/\"kube-root-ca.crt\"" Apr 17 21:06:09.903886 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:09.903772 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-znlkx\"/\"openshift-service-ca.crt\"" Apr 17 21:06:09.903940 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:09.903883 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-znlkx\"/\"default-dockercfg-9snlf\"" Apr 17 21:06:09.910496 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:09.910474 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx"] Apr 17 21:06:10.034449 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.034410 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-podres\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: \"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.034449 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.034448 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mzks\" (UniqueName: 
\"kubernetes.io/projected/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-kube-api-access-6mzks\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: \"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.034669 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.034527 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-proc\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: \"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.034669 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.034591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-sys\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: \"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.034669 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.034610 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-lib-modules\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: \"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.136046 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.136001 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-podres\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: \"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.136233 
ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.136081 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mzks\" (UniqueName: \"kubernetes.io/projected/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-kube-api-access-6mzks\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: \"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.136233 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.136085 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-podres\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: \"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.136233 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.136182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-proc\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: \"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.136399 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.136237 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-sys\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: \"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.136399 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.136274 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-lib-modules\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: 
\"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.136514 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.136465 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-lib-modules\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: \"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.136572 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.136524 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-proc\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: \"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.136572 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.136557 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-sys\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: \"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.144478 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.144454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mzks\" (UniqueName: \"kubernetes.io/projected/5311a835-a6ae-4e05-9f2b-2627e0aa2b9a-kube-api-access-6mzks\") pod \"perf-node-gather-daemonset-lmpwx\" (UID: \"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.200078 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.200014 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-7f797c447c-qgvtl_b90ece81-32c7-4785-b359-6704d69fbae3/console/0.log" Apr 17 21:06:10.212036 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.212016 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" Apr 17 21:06:10.231262 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.231237 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-ddvr7_bcde0453-95f9-4d89-8211-99b7ac7f0b68/download-server/0.log" Apr 17 21:06:10.335591 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.335559 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx"] Apr 17 21:06:10.337005 ip-10-0-137-102 kubenswrapper[2572]: W0417 21:06:10.336970 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5311a835_a6ae_4e05_9f2b_2627e0aa2b9a.slice/crio-70dc8e0e0ff0adae76e286737d03545c3a96b5d4779a7a7423129d1bf5a22fc9 WatchSource:0}: Error finding container 70dc8e0e0ff0adae76e286737d03545c3a96b5d4779a7a7423129d1bf5a22fc9: Status 404 returned error can't find the container with id 70dc8e0e0ff0adae76e286737d03545c3a96b5d4779a7a7423129d1bf5a22fc9 Apr 17 21:06:10.338570 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.338548 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 21:06:10.635028 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.634992 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" event={"ID":"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a","Type":"ContainerStarted","Data":"b34ea35653344d20d3737c815f187efc385c46d5b8bd30129539ef48939ef32c"} Apr 17 21:06:10.635028 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.635033 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" event={"ID":"5311a835-a6ae-4e05-9f2b-2627e0aa2b9a","Type":"ContainerStarted","Data":"70dc8e0e0ff0adae76e286737d03545c3a96b5d4779a7a7423129d1bf5a22fc9"}
Apr 17 21:06:10.635252 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.635081 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx"
Apr 17 21:06:10.652670 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:10.652621 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx" podStartSLOduration=1.652608971 podStartE2EDuration="1.652608971s" podCreationTimestamp="2026-04-17 21:06:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:06:10.649708504 +0000 UTC m=+1322.374975872" watchObservedRunningTime="2026-04-17 21:06:10.652608971 +0000 UTC m=+1322.377876328"
Apr 17 21:06:11.596025 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:11.595997 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zbsnq_c75b554f-1463-4bda-8049-f6e4988ffef7/dns/0.log"
Apr 17 21:06:11.615471 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:11.615446 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zbsnq_c75b554f-1463-4bda-8049-f6e4988ffef7/kube-rbac-proxy/0.log"
Apr 17 21:06:11.676923 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:11.676889 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kk8ft_4e6f44f7-3183-4cd5-9000-e9662459d6af/dns-node-resolver/0.log"
Apr 17 21:06:12.154629 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:12.154601 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vh6vg_73fda819-3b91-4892-92f0-995a9a9014c8/node-ca/0.log"
Apr 17 21:06:12.903459 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:12.903427 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfq7lg4_53fbeaf4-e0cf-4a00-8d61-654887368d4a/istio-proxy/0.log"
Apr 17 21:06:13.006235 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:13.006209 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-8fdfb_84b13877-31ca-4a62-93d7-502c83f5e077/istio-proxy/0.log"
Apr 17 21:06:13.029699 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:13.029674 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-798db9665d-pb48m_b366eb11-7fda-4ca8-ae8e-3fb7b0b24a8b/router/0.log"
Apr 17 21:06:13.502024 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:13.501996 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6qvxz_79374dd1-1272-4edf-9d10-449bca8feb97/serve-healthcheck-canary/0.log"
Apr 17 21:06:14.045437 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:14.045413 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nxrn6_4957c690-d1ba-429f-802c-407963355d82/kube-rbac-proxy/0.log"
Apr 17 21:06:14.066502 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:14.066479 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nxrn6_4957c690-d1ba-429f-802c-407963355d82/exporter/0.log"
Apr 17 21:06:14.088178 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:14.088152 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nxrn6_4957c690-d1ba-429f-802c-407963355d82/extractor/0.log"
Apr 17 21:06:15.957117 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:15.957093 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6dc4849f89-8z4fv_9167dae3-8f4c-426c-b9f7-095acc43072f/manager/0.log"
Apr 17 21:06:16.648442 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:16.648419 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-lmpwx"
Apr 17 21:06:17.095362 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:17.095336 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7f68665c84-k8xv9_8d34bf09-f220-4c50-b3b5-61a87a621564/manager/0.log"
Apr 17 21:06:22.674418 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:22.674392 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-66wwv_486c966c-4220-4865-a777-76f49bb4fa62/kube-multus-additional-cni-plugins/0.log"
Apr 17 21:06:22.696306 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:22.696276 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-66wwv_486c966c-4220-4865-a777-76f49bb4fa62/egress-router-binary-copy/0.log"
Apr 17 21:06:22.717912 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:22.717889 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-66wwv_486c966c-4220-4865-a777-76f49bb4fa62/cni-plugins/0.log"
Apr 17 21:06:22.738162 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:22.738121 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-66wwv_486c966c-4220-4865-a777-76f49bb4fa62/bond-cni-plugin/0.log"
Apr 17 21:06:22.758914 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:22.758892 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-66wwv_486c966c-4220-4865-a777-76f49bb4fa62/routeoverride-cni/0.log"
Apr 17 21:06:22.779576 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:22.779557 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-66wwv_486c966c-4220-4865-a777-76f49bb4fa62/whereabouts-cni-bincopy/0.log"
Apr 17 21:06:22.803332 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:22.803315 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-66wwv_486c966c-4220-4865-a777-76f49bb4fa62/whereabouts-cni/0.log"
Apr 17 21:06:23.106407 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:23.106377 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k7xlk_691501fe-3ef5-4d58-bc4a-7ce8a3702e4d/kube-multus/0.log"
Apr 17 21:06:23.176522 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:23.176492 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gggxp_83681f84-53f3-489d-9b30-0db22fc1b40e/network-metrics-daemon/0.log"
Apr 17 21:06:23.198255 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:23.198228 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gggxp_83681f84-53f3-489d-9b30-0db22fc1b40e/kube-rbac-proxy/0.log"
Apr 17 21:06:24.555894 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:24.555868 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-controller/0.log"
Apr 17 21:06:24.572478 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:24.572448 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/0.log"
Apr 17 21:06:24.577899 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:24.577882 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovn-acl-logging/1.log"
Apr 17 21:06:24.596677 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:24.596653 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/kube-rbac-proxy-node/0.log"
Apr 17 21:06:24.620716 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:24.620699 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 21:06:24.639418 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:24.639401 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/northd/0.log"
Apr 17 21:06:24.659899 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:24.659881 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/nbdb/0.log"
Apr 17 21:06:24.680388 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:24.680374 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/sbdb/0.log"
Apr 17 21:06:24.775770 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:24.775745 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-trqtd_02dfb3c4-9530-4d2f-a953-075c7fc184b1/ovnkube-controller/0.log"
Apr 17 21:06:25.847284 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:25.847254 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-4595v_3c8131bf-3394-4f77-956d-2b283e575873/network-check-target-container/0.log"
Apr 17 21:06:26.814465 ip-10-0-137-102 kubenswrapper[2572]: I0417 21:06:26.814439 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-542j9_ff83db54-9c7a-4cea-8c98-f941a157f101/iptables-alerter/0.log"